Pose estimation detects 17 body keypoints per person — nose, eyes, shoulders, elbows, wrists, hips, knees, and ankles — at 40+ FPS. Use it for sports analysis, ergonomics, fall detection, or gesture control.
What you will learn
- How pose estimation works — keypoints vs bounding boxes
- How to detect 17-point body skeletons in real time
- How to calculate joint angles (e.g. elbow bend, knee flexion)
- How to classify actions from pose (standing, sitting, falling)
- Applications: sports coaching, workplace ergonomics, fall detection
Step 1 — Run the pose demo
cd ~/tutorials/03-pose-estimation
python3 pose.py --source 0 --show
You will see a skeleton drawn over every person in frame. Each of the 17 keypoints is shown as a coloured dot, connected by lines representing limbs.
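The 17 keypoints follow the standard COCO ordering, which is what the index lookups in the next steps rely on. A quick reference list (a sketch for convenience — the names here are descriptive labels, not identifiers from the library):

```python
# COCO keypoint order used by YOLOv8-pose (index → joint)
COCO_KEYPOINTS = [
    "nose", "left_eye", "right_eye", "left_ear", "right_ear",
    "left_shoulder", "right_shoulder", "left_elbow", "right_elbow",
    "left_wrist", "right_wrist", "left_hip", "right_hip",
    "left_knee", "right_knee", "left_ankle", "right_ankle",
]

print(COCO_KEYPOINTS.index("left_wrist"))   # 9
print(COCO_KEYPOINTS.index("right_ankle"))  # 16
```

Keep this list handy: `person[9]` means the left wrist only because of this ordering.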
Step 2 — Access keypoint coordinates
from ultralytics import YOLO

model = YOLO("yolov8n-pose.engine", task="pose")
results = model(frame)  # `frame` is a single image from your camera capture
for person in results[0].keypoints.data:
    # 17 keypoints per person; each row is [x, y, confidence]
    nose = person[0]          # [x, y, conf]
    left_wrist = person[9]
    right_wrist = person[10]
    print(f"Nose position: ({nose[0]:.0f}, {nose[1]:.0f})")
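Occluded joints still get coordinates, just with low confidence, so in practice you want to filter by the third value before using a keypoint. A minimal sketch with a hypothetical helper (`visible_keypoints` is not part of the library; it operates on one person's (17, 3) array after converting to NumPy):

```python
import numpy as np

def visible_keypoints(person, min_conf=0.5):
    """Return {index: (x, y)} for keypoints above a confidence threshold.

    `person` is a (17, 3) array of [x, y, conf] rows — one detection's
    keypoints, e.g. results[0].keypoints.data[i] converted with .numpy().
    """
    kp = np.asarray(person)
    return {i: (float(x), float(y))
            for i, (x, y, c) in enumerate(kp) if c >= min_conf}

# Made-up example: only the nose (index 0) is detected confidently
demo = np.zeros((17, 3))
demo[0] = [320, 120, 0.9]
print(visible_keypoints(demo))  # {0: (320.0, 120.0)}
```

Downstream code (angles, fall detection) should skip any person whose required joints are missing from this dict.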
Step 3 — Calculate a joint angle
import numpy as np
def joint_angle(a, b, c):
    """Calculate the angle (in degrees) at joint b, given points a, b, c."""
    ba = np.array(a) - np.array(b)
    bc = np.array(c) - np.array(b)
    cosine = np.dot(ba, bc) / (np.linalg.norm(ba) * np.linalg.norm(bc))
    return np.degrees(np.arccos(np.clip(cosine, -1.0, 1.0)))
# Example: elbow angle (shoulder → elbow → wrist)
shoulder = person[5][:2].numpy()
elbow = person[7][:2].numpy()
wrist = person[9][:2].numpy()
angle = joint_angle(shoulder, elbow, wrist)
print(f"Elbow angle: {angle:.1f}°")
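The same angle function covers the action-classification idea from the learning goals: a straight leg reads near 180° at the knee, a bent one far less. A rough sitting/standing guess as a sketch — `posture_from_knee` and the 120° cut-off are assumptions to tune, not a validated classifier:

```python
import numpy as np

def joint_angle(a, b, c):
    """Angle (degrees) at joint b, given 2-D points a, b, c."""
    ba = np.array(a) - np.array(b)
    bc = np.array(c) - np.array(b)
    cosine = np.dot(ba, bc) / (np.linalg.norm(ba) * np.linalg.norm(bc))
    return np.degrees(np.arccos(np.clip(cosine, -1.0, 1.0)))

def posture_from_knee(hip, knee, ankle, bent_below=120.0):
    """Hypothetical helper: classify by knee flexion (hip → knee → ankle)."""
    return "sitting" if joint_angle(hip, knee, ankle) < bent_below else "standing"

# Straight leg, joints stacked vertically → ~180° → standing
print(posture_from_knee((100, 100), (100, 200), (100, 300)))  # standing
# Ankle displaced sideways → ~90° → sitting
print(posture_from_knee((100, 100), (100, 200), (200, 200)))  # sitting
```

For real footage, average both legs and require the pose to hold for several frames before switching labels.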
Step 4 — Simple fall detection
def is_fallen(keypoints):
    """Simple fall detection — head at roughly the same height as the hips."""
    nose = keypoints[0]
    hip_l = keypoints[11]
    hip_r = keypoints[12]
    hip_y = (hip_l[1] + hip_r[1]) / 2
    # If the head is near (or below) hip level, the body is likely horizontal.
    # The 60-pixel threshold is resolution-dependent — tune it for your camera.
    return abs(nose[1] - hip_y) < 60
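A single frame that passes this check is a weak signal — bending down to tie a shoe looks the same for an instant. One common fix is to debounce over a short window of frames before raising an alert. A sketch with a hypothetical `FallAlarm` class (the window and hit-count values are assumptions — tune them for your frame rate):

```python
from collections import deque

class FallAlarm:
    """Raise an alert only after most recent frames look fallen."""

    def __init__(self, window=15, min_hits=12):
        self.history = deque(maxlen=window)  # rolling per-frame verdicts
        self.min_hits = min_hits

    def update(self, fallen: bool) -> bool:
        """Record one frame's is_fallen() result; return True to alert."""
        self.history.append(fallen)
        return sum(self.history) >= self.min_hits

alarm = FallAlarm(window=5, min_hits=4)
for frame_fallen in [True, True, False, True, True]:
    fired = alarm.update(frame_fallen)
print(fired)  # True — 4 of the last 5 frames looked fallen
```

Call `alarm.update(is_fallen(keypoints))` once per frame inside your capture loop; a brief crouch no longer trips the alert, while a genuine fall does within about a second.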
✅ Next: Tutorial 4 — AI Plant Health Monitor | Back to Jetson Kit