This is the code for browser-based pose estimation, written in JavaScript inside a single HTML file and generated by Claude.AI.
Save the code below (everything from <!DOCTYPE html> to </html>) as an .html file, open it in your browser, and grant permission to use your webcam. Click Start Detection to begin detecting poses.
Are the video images being sent anywhere? No. The video and pose data stay entirely on your device; the application runs in your browser and transmits nothing to external servers.
Internet access is needed only to load the TensorFlow.js libraries from cdn.jsdelivr.net.
The pose coordinates are processed locally by TensorFlow.js.
All keypoint data is stored only in browser memory.
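For reference, each keypoint the app keeps in memory looks roughly like this (values are illustrative; the field names match those used in the script below):

// A single MoveNet keypoint as handled by the code below (illustrative values)
const keypoint = {
name: "left_wrist", // one of MoveNet's 17 named keypoints
x: 231.4,           // pixel position within the video frame
y: 187.9,
score: 0.87         // confidence from 0 to 1; the app treats > 0.3 as reliable
};

An exported frame (the Export Current Frame button) is a JSON file containing a timestamp, the current FPS, and the detected pose (its overall score plus an array of these keypoints); nothing else is saved.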
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8" />
<meta name="viewport" content="width=device-width, initial-scale=1.0" />
<title>Pose Data Viewer</title>
<style>
* {
margin: 0;
padding: 0;
box-sizing: border-box;
}
body {
font-family: 'SF Pro Display', -apple-system, BlinkMacSystemFont, sans-serif;
background: #f8fafc;
color: #1e293b;
line-height: 1.6;
}
.header {
background: white;
padding: 20px;
border-bottom: 1px solid #e2e8f0;
text-align: center;
}
.header h1 {
font-size: 24px;
font-weight: 600;
color: #0f172a;
}
.status {
display: inline-flex;
align-items: center;
gap: 8px;
margin-top: 8px;
padding: 6px 12px;
background: #f1f5f9;
border-radius: 20px;
font-size: 14px;
}
.status-dot {
width: 8px;
height: 8px;
border-radius: 50%;
background: #ef4444;
}
.status-dot.active {
background: #10b981;
}
.container {
max-width: 1200px;
margin: 0 auto;
padding: 20px;
display: grid;
grid-template-columns: 1fr 1fr;
gap: 30px;
align-items: start;
}
.video-section {
background: white;
border-radius: 12px;
padding: 20px;
box-shadow: 0 1px 3px rgba(0,0,0,0.1);
}
.video-container {
position: relative;
border-radius: 8px;
overflow: hidden;
background: #000;
}
#video, #output {
width: 100%;
height: auto;
transform: scaleX(-1);
display: block;
}
#output {
position: absolute;
top: 0;
left: 0;
}
.controls {
margin-top: 15px;
display: flex;
gap: 10px;
justify-content: center;
}
button {
background: #3b82f6;
color: white;
border: none;
padding: 10px 16px;
border-radius: 6px;
font-size: 13px;
font-weight: 500;
cursor: pointer;
transition: background 0.2s;
}
button:hover {
background: #2563eb;
}
button:disabled {
background: #94a3b8;
cursor: not-allowed;
}
.data-section {
background: white;
border-radius: 12px;
padding: 20px;
box-shadow: 0 1px 3px rgba(0,0,0,0.1);
height: fit-content;
}
.data-header {
display: flex;
justify-content: space-between;
align-items: center;
margin-bottom: 20px;
padding-bottom: 15px;
border-bottom: 1px solid #e2e8f0;
}
.data-header h2 {
font-size: 18px;
font-weight: 600;
}
.data-stats {
display: flex;
gap: 20px;
font-size: 12px;
color: #64748b;
}
.stat {
display: flex;
flex-direction: column;
align-items: center;
}
.stat-value {
font-size: 16px;
font-weight: 600;
color: #1e293b;
}
.data-display {
max-height: 500px;
overflow-y: auto;
border: 1px solid #e2e8f0;
border-radius: 8px;
background: #f8fafc;
}
.data-tabs {
display: flex;
border-bottom: 1px solid #e2e8f0;
background: white;
}
.tab {
padding: 12px 20px;
background: none;
border: none;
border-bottom: 2px solid transparent;
cursor: pointer;
font-size: 14px;
color: #64748b;
flex: 1;
}
.tab.active {
color: #3b82f6;
border-bottom-color: #3b82f6;
background: #f8fafc;
}
.tab-content {
padding: 20px;
font-family: 'SF Mono', Monaco, 'Cascadia Code', monospace;
font-size: 12px;
line-height: 1.5;
white-space: pre-wrap;
background: #f8fafc;
}
.keypoint-grid {
display: grid;
grid-template-columns: repeat(auto-fit, minmax(200px, 1fr));
gap: 15px;
padding: 15px;
}
.keypoint-item {
background: white;
padding: 12px;
border-radius: 6px;
border-left: 3px solid #3b82f6;
}
.keypoint-name {
font-weight: 600;
color: #1e293b;
margin-bottom: 5px;
font-size: 13px;
}
.keypoint-coords {
font-family: 'SF Mono', Monaco, monospace;
font-size: 11px;
color: #64748b;
line-height: 1.4;
}
.confidence-bar {
width: 100%;
height: 4px;
background: #e2e8f0;
border-radius: 2px;
margin-top: 8px;
overflow: hidden;
}
.confidence-fill {
height: 100%;
background: linear-gradient(90deg, #ef4444, #f59e0b, #10b981);
border-radius: 2px;
transition: width 0.3s ease;
}
.no-data {
text-align: center;
color: #64748b;
padding: 40px 20px;
font-style: italic;
}
@media (max-width: 768px) {
.container {
grid-template-columns: 1fr;
gap: 20px;
}
}
</style>
</head>
<body>
<div class="header">
<h1>Pose Data Viewer</h1>
<div class="status">
<div class="status-dot" id="statusDot"></div>
<span id="statusText">Initializing...</span>
</div>
</div>
<div class="container">
<div class="video-section">
<div class="video-container">
<video id="video" width="480" height="360" autoplay muted playsinline></video>
<canvas id="output" width="480" height="360"></canvas>
</div>
<div class="controls">
<button id="toggleDetection" disabled>Start Detection</button>
<button id="testDetection" disabled>Test Single Frame</button>
<button id="exportData" disabled>Export Current Frame</button>
</div>
</div>
<div class="data-section">
<div class="data-header">
<h2>Live Pose Data</h2>
<div class="data-stats">
<div class="stat">
<div class="stat-value" id="frameRate">0</div>
<div>Render FPS</div>
</div>
<div class="stat">
<div class="stat-value" id="detectionRate">0</div>
<div>Detection FPS</div>
</div>
<div class="stat">
<div class="stat-value" id="detectedKeypoints">0</div>
<div>Keypoints</div>
</div>
<div class="stat">
<div class="stat-value" id="avgConfidence">0%</div>
<div>Confidence</div>
</div>
</div>
</div>
<div class="data-display">
<div class="data-tabs">
<button class="tab active" data-tab="visual">Visual</button>
<button class="tab" data-tab="json">JSON</button>
<button class="tab" data-tab="csv">Table</button>
</div>
<div class="tab-content" id="visualTab">
<div class="keypoint-grid" id="keypointGrid">
<div class="no-data">No pose detected</div>
</div>
</div>
<div class="tab-content" id="jsonTab" style="display: none;">
<div id="jsonData">No data available</div>
</div>
<div class="tab-content" id="csvTab" style="display: none;">
<div id="csvData">No data available</div>
</div>
</div>
</div>
</div>
<!-- TensorFlow.js and Pose Detection -->
<script src="https://cdn.jsdelivr.net/npm/@tensorflow/tfjs-core"></script>
<script src="https://cdn.jsdelivr.net/npm/@tensorflow/tfjs-converter"></script>
<script src="https://cdn.jsdelivr.net/npm/@tensorflow-models/pose-detection"></script>
<script src="https://cdn.jsdelivr.net/npm/@tensorflow/tfjs-backend-webgl"></script>
<script>
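// SimplePoseViewer ties together the webcam stream, the MoveNet detector,
// and the UI. All state (detector, latest pose, FPS counters) lives on this
// class instance and is discarded when the page is closed.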
class SimplePoseViewer {
constructor() {
this.video = document.getElementById('video');
this.canvas = document.getElementById('output');
this.ctx = this.canvas.getContext('2d');
this.detector = null;
this.isDetecting = false;
this.currentPose = null;
this.frameCount = 0;
this.lastFrameTime = 0;
this.lastDetectionTime = 0;
this.fps = 0;
this.detectionFps = 0;
this.initializeElements();
this.setupTabs();
this.init();
}
initializeElements() {
this.statusDot = document.getElementById('statusDot');
this.statusText = document.getElementById('statusText');
this.toggleBtn = document.getElementById('toggleDetection');
this.testBtn = document.getElementById('testDetection');
this.exportBtn = document.getElementById('exportData');
this.frameRateEl = document.getElementById('frameRate');
this.detectionRateEl = document.getElementById('detectionRate');
this.detectedKeypointsEl = document.getElementById('detectedKeypoints');
this.avgConfidenceEl = document.getElementById('avgConfidence');
this.keypointGrid = document.getElementById('keypointGrid');
this.jsonData = document.getElementById('jsonData');
this.csvData = document.getElementById('csvData');
this.toggleBtn.addEventListener('click', () => this.toggleDetection());
this.testBtn.addEventListener('click', () => this.testSingleFrame());
this.exportBtn.addEventListener('click', () => this.exportCurrentFrame());
}
setupTabs() {
const tabs = document.querySelectorAll('.tab');
tabs.forEach(tab => {
tab.addEventListener('click', () => {
const tabName = tab.dataset.tab;
this.switchTab(tabName);
});
});
}
switchTab(tabName) {
// Update tab buttons
document.querySelectorAll('.tab').forEach(tab => {
tab.classList.toggle('active', tab.dataset.tab === tabName);
});
// Show/hide content
document.getElementById('visualTab').style.display = tabName === 'visual' ? 'block' : 'none';
document.getElementById('jsonTab').style.display = tabName === 'json' ? 'block' : 'none';
document.getElementById('csvTab').style.display = tabName === 'csv' ? 'block' : 'none';
}
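// Request a webcam stream via getUserMedia, attach it to the <video> element,
// and resolve once the video metadata (dimensions) is available.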
async setupCamera() {
try {
console.log('Requesting camera access...');
const stream = await navigator.mediaDevices.getUserMedia({
video: {
width: 480,
height: 360,
frameRate: { ideal: 30, max: 60 }
}
});
this.video.srcObject = stream;
await new Promise(resolve => this.video.onloadedmetadata = resolve);
this.video.play();
// Match the overlay canvas to the actual stream resolution so the keypoint
// coordinates returned by the detector line up with the video image.
this.canvas.width = this.video.videoWidth;
this.canvas.height = this.video.videoHeight;
console.log('Video playing, dimensions:', this.video.videoWidth, 'x', this.video.videoHeight);
return true;
} catch (error) {
console.error('Camera setup failed:', error);
this.updateStatus(`Camera access denied: ${error.message}`, false);
return false;
}
}
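// Initialization sequence: select the WebGL backend, start the camera,
// then load the MoveNet SinglePose.Lightning detector before enabling the buttons.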
async init() {
try {
console.log('Starting initialization...');
await tf.setBackend('webgl');
console.log('TensorFlow backend set to webgl');
this.updateStatus('Setting up camera...', false);
const cameraReady = await this.setupCamera();
if (!cameraReady) return;
console.log('Camera setup complete');
this.updateStatus('Loading pose model...', false);
this.detector = await poseDetection.createDetector(
poseDetection.SupportedModels.MoveNet,
{
modelType: 'SinglePose.Lightning',
enableSmoothing: false,
minPoseScore: 0.3
}
);
console.log('Pose detector loaded');
this.updateStatus('Ready to detect poses', true);
this.toggleBtn.disabled = false;
this.testBtn.disabled = false;
this.toggleBtn.textContent = 'Start Detection';
} catch (error) {
console.error('Initialization failed:', error);
this.updateStatus(`Initialization failed: ${error.message}`, false);
}
}
updateStatus(text, active) {
this.statusText.textContent = text;
this.statusDot.className = `status-dot ${active ? 'active' : ''}`;
}
toggleDetection() {
if (this.isDetecting) {
this.stopDetection();
} else {
this.startDetection();
}
}
startDetection() {
console.log('Starting detection...');
if (!this.detector) {
console.error('Detector not initialized');
alert('Pose detector not ready. Please wait for initialization to complete.');
return;
}
if (!this.video.videoWidth || !this.video.videoHeight) {
console.error('Video not ready');
alert('Camera not ready. Please ensure camera permissions are granted.');
return;
}
this.isDetecting = true;
this.toggleBtn.textContent = 'Stop Detection';
this.exportBtn.disabled = false;
this.updateStatus('Detecting poses...', true);
console.log('Detection started, calling detect()');
this.detect();
}
stopDetection() {
this.isDetecting = false;
this.toggleBtn.textContent = 'Start Detection';
this.exportBtn.disabled = true;
this.updateStatus('Detection stopped', false);
this.ctx.clearRect(0, 0, this.canvas.width, this.canvas.height);
}
async testSingleFrame() {
if (!this.detector) {
alert('Detector not ready');
return;
}
try {
console.log('Testing single frame detection...');
this.updateStatus('Testing detection...', true);
const poses = await this.detector.estimatePoses(this.video);
console.log('Test result - poses found:', poses.length);
if (poses[0]) {
console.log('Test pose keypoints:', poses[0].keypoints.length);
this.currentPose = poses[0];
this.drawPose(poses[0]);
this.updateData(poses[0]);
this.updateStatus('Test successful - pose detected!', true);
} else {
console.log('Test - no pose detected');
this.updateStatus('Test complete - no pose detected', true);
}
setTimeout(() => {
this.updateStatus('Ready to detect poses', true);
}, 2000);
} catch (error) {
console.error('Test detection failed:', error);
this.updateStatus(`Test failed: ${error.message}`, false);
}
}
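// Main loop, driven by requestAnimationFrame. Pose estimation runs on every
// second frame to reduce load; the most recent pose is redrawn on every frame
// so the overlay stays smooth.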
async detect() {
if (!this.isDetecting || !this.detector) {
console.log('Detection stopped or detector not ready');
return;
}
try {
const startTime = performance.now();
// Skip frames for better performance - only detect every 2nd frame
this.frameCount++;
if (this.frameCount % 2 === 0) {
const detectionStart = performance.now();
// Use the video directly for pose detection instead of smaller canvas
// This simplifies the process and avoids potential scaling issues
const poses = await this.detector.estimatePoses(this.video);
console.log('Poses detected:', poses.length);
if (poses[0]) {
this.currentPose = poses[0];
this.updateData(poses[0]);
console.log('Pose updated with', poses[0].keypoints.length, 'keypoints');
} else {
console.log('No pose detected in this frame');
}
this.updateDetectionFPS(detectionStart);
}
// Always draw the last known pose for smooth visualization
this.drawPose(this.currentPose);
this.updateRenderFPS(startTime);
} catch (error) {
console.error('Detection error:', error);
this.updateStatus(`Detection error: ${error.message}`, false);
this.stopDetection();
return;
}
if (this.isDetecting) {
requestAnimationFrame(() => this.detect());
}
}
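// Draw the skeleton overlay. The index pairs below refer to MoveNet's 17
// keypoints (nose, eyes, ears, shoulders, elbows, wrists, hips, knees, ankles);
// only keypoints with score > 0.3 are drawn.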
drawPose(pose) {
this.ctx.clearRect(0, 0, this.canvas.width, this.canvas.height);
if (!pose) return;
const keypoints = pose.keypoints;
// Draw skeleton connections
const connections = [
[0, 1], [0, 2], [1, 3], [2, 4], [5, 6], [5, 7], [7, 9],
[6, 8], [8, 10], [5, 11], [6, 12], [11, 12], [11, 13],
[13, 15], [12, 14], [14, 16]
];
this.ctx.strokeStyle = '#3b82f6';
this.ctx.lineWidth = 2;
connections.forEach(([i, j]) => {
const kp1 = keypoints[i];
const kp2 = keypoints[j];
if (kp1.score > 0.3 && kp2.score > 0.3) {
this.ctx.beginPath();
this.ctx.moveTo(kp1.x, kp1.y);
this.ctx.lineTo(kp2.x, kp2.y);
this.ctx.stroke();
}
});
// Draw keypoints
keypoints.forEach(kp => {
if (kp.score > 0.3) {
this.ctx.beginPath();
this.ctx.arc(kp.x, kp.y, 4, 0, 2 * Math.PI);
this.ctx.fillStyle = this.getConfidenceColor(kp.score);
this.ctx.fill();
this.ctx.strokeStyle = 'white';
this.ctx.lineWidth = 1;
this.ctx.stroke();
}
});
}
getConfidenceColor(confidence) {
if (confidence > 0.8) return '#10b981';
if (confidence > 0.6) return '#f59e0b';
return '#ef4444';
}
updateData(pose) {
this.currentPose = pose;
if (pose) {
const validKeypoints = pose.keypoints.filter(kp => kp.score > 0.3);
const avgConfidence = validKeypoints.length
? validKeypoints.reduce((sum, kp) => sum + kp.score, 0) / validKeypoints.length
: 0; // guard against division by zero when no keypoint passes the threshold
this.detectedKeypointsEl.textContent = validKeypoints.length;
this.avgConfidenceEl.textContent = `${Math.round(avgConfidence * 100)}%`;
this.updateVisualTab(pose);
this.updateJSONTab(pose);
this.updateCSVTab(pose);
} else {
this.detectedKeypointsEl.textContent = '0';
this.avgConfidenceEl.textContent = '0%';
this.updateNoData();
}
}
updateVisualTab(pose) {
this.keypointGrid.innerHTML = '';
if (!pose) {
this.keypointGrid.innerHTML = '<div class="no-data">No pose detected</div>';
return;
}
pose.keypoints.forEach(kp => {
if (kp.score > 0.1) {
const item = document.createElement('div');
item.className = 'keypoint-item';
item.innerHTML = `
<div class="keypoint-name">${kp.name}</div>
<div class="keypoint-coords">
X: ${kp.x.toFixed(1)}px<br>
Y: ${kp.y.toFixed(1)}px<br>
Score: ${kp.score.toFixed(3)}
</div>
<div class="confidence-bar">
<div class="confidence-fill" style="width: ${kp.score * 100}%"></div>
</div>
`;
this.keypointGrid.appendChild(item);
}
});
}
updateJSONTab(pose) {
if (!pose) {
this.jsonData.textContent = 'No pose detected';
return;
}
const data = {
timestamp: Date.now(),
pose: {
score: pose.score,
keypoints: pose.keypoints.map(kp => ({
name: kp.name,
x: Math.round(kp.x * 100) / 100,
y: Math.round(kp.y * 100) / 100,
score: Math.round(kp.score * 1000) / 1000
}))
}
};
this.jsonData.textContent = JSON.stringify(data, null, 2);
}
updateCSVTab(pose) {
if (!pose) {
this.csvData.textContent = 'No pose detected';
return;
}
let csv = 'keypoint,x,y,score\n';
pose.keypoints.forEach(kp => {
csv += `${kp.name},${kp.x.toFixed(2)},${kp.y.toFixed(2)},${kp.score.toFixed(3)}\n`;
});
this.csvData.textContent = csv;
}
updateNoData() {
this.keypointGrid.innerHTML = '<div class="no-data">No pose detected</div>';
this.jsonData.textContent = 'No pose detected';
this.csvData.textContent = 'No pose detected';
}
updateRenderFPS(startTime) {
const endTime = performance.now();
const frameTime = endTime - this.lastFrameTime;
this.lastFrameTime = endTime;
if (frameTime > 0) {
this.fps = Math.round(1000 / frameTime);
this.frameRateEl.textContent = this.fps;
}
}
updateDetectionFPS(startTime) {
const endTime = performance.now();
const detectionTime = endTime - this.lastDetectionTime;
this.lastDetectionTime = endTime;
if (detectionTime > 0) {
this.detectionFps = Math.round(1000 / detectionTime);
this.detectionRateEl.textContent = this.detectionFps;
}
}
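// Serialize the most recent pose to JSON and trigger a client-side download
// via a temporary object URL; no data leaves the browser.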
exportCurrentFrame() {
if (!this.currentPose) {
alert('No pose data to export');
return;
}
const data = {
timestamp: Date.now(),
fps: this.fps,
pose: this.currentPose
};
const blob = new Blob([JSON.stringify(data, null, 2)], { type: 'application/json' });
const url = URL.createObjectURL(blob);
const a = document.createElement('a');
a.href = url;
a.download = `pose_frame_${Date.now()}.json`;
document.body.appendChild(a);
a.click();
document.body.removeChild(a);
URL.revokeObjectURL(url);
}
}
// Initialize when page loads
window.addEventListener('load', () => {
new SimplePoseViewer();
});
</script>
</body>
</html>
IMPORTANT DISCLAIMER & TERMS OF USE
⚠️ NO WARRANTY - USE AT YOUR OWN RISK
This pose estimation software is provided "AS IS" without warranty of any kind. By using this application, you acknowledge and agree to the following terms:
🚫 No Warranty
No express or implied warranties of merchantability, fitness for a particular purpose, or non-infringement
No guarantee of accuracy, reliability, or completeness of pose detection data
No warranty that the software will be error-free or operate without interruption
⚡ Limitation of Liability
Use entirely at your own risk - the developers assume no responsibility for any consequences
No liability for damages including but not limited to data loss, injury, or equipment damage
Not responsible for decisions made based on pose detection results
🏥 Not for Medical/Professional Use
Educational and experimental purposes only
Not intended for medical diagnosis, treatment, or professional analysis
Not suitable for safety-critical applications
Consult qualified professionals for medical or professional advice
🔒 Privacy & Data
All processing occurs locally in your browser
No data transmitted to external servers
User responsible for exported data security
⚠️ WARNING: By using this software, you acknowledge that you have read, understood, and agree to use this application entirely at your own risk.