# JawTrack
JawTrack is a real-time jaw motion analysis system that uses computer vision to track and analyze jaw movements. Built with MediaPipe and OpenCV, it provides quantitative measurements for jaw motion assessment.
## Features

- Real-time jaw motion tracking
- Video-based analysis
- Quantitative measurements:
  - Jaw opening distance
  - Lateral deviation
  - Movement patterns
- Data visualization
- Assessment reports
- CSV data export
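To make the measurements concrete, here is a minimal sketch of how jaw opening and lateral deviation can be computed from face-mesh landmarks. The landmark indices and the normalized-coordinate geometry below are assumptions for illustration; JawTrack's actual analysis code may use different points or calibration.

```python
import numpy as np

# Assumed MediaPipe face-mesh landmark indices (illustrative only):
# 13 = upper inner lip, 14 = lower inner lip, 152 = chin tip, 1 = nose tip.
UPPER_LIP, LOWER_LIP, CHIN, NOSE = 13, 14, 152, 1

def jaw_opening(landmarks: np.ndarray) -> float:
    """Euclidean distance between the inner-lip landmarks."""
    return float(np.linalg.norm(landmarks[UPPER_LIP] - landmarks[LOWER_LIP]))

def lateral_deviation(landmarks: np.ndarray) -> float:
    """Horizontal offset of the chin from the nose (facial midline).

    Positive values mean the jaw deviates toward the image's right.
    """
    return float(landmarks[CHIN][0] - landmarks[NOSE][0])

# Toy landmark array: 468 (x, y) points in normalized image units,
# all zeros except the four landmarks used above.
pts = np.zeros((468, 2))
pts[UPPER_LIP] = [0.50, 0.60]
pts[LOWER_LIP] = [0.50, 0.70]
pts[CHIN] = [0.52, 0.90]
pts[NOSE] = [0.50, 0.40]
print(round(jaw_opening(pts), 4))        # 0.1
print(round(lateral_deviation(pts), 4))  # 0.02
```

In a real pipeline these values would be scaled from normalized image coordinates to millimeters using a calibration reference before being reported.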
## Requirements

- Python 3.10+
- OpenCV
- MediaPipe
- Gradio
- NumPy
- Pandas
- Matplotlib
## Installation

1. Clone the repository:

   ```bash
   git clone https://github.com/yourusername/jawtrack.git
   cd jawtrack
   ```

2. Create a virtual environment:

   ```bash
   python -m venv venv
   source venv/bin/activate  # On Windows: venv\Scripts\activate
   ```

3. Install dependencies:

   ```bash
   pip install -r requirements.txt
   ```
## Usage

1. Start the application:

   ```bash
   python app.py
   ```

2. Open your web browser and navigate to:

   http://localhost:7860

3. Upload a video, or use the webcam for real-time analysis.
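The CSV export mentioned in the features could be produced roughly like this sketch. The column names and units are illustrative assumptions, not JawTrack's actual schema:

```python
import pandas as pd

# Hypothetical per-frame measurements; JawTrack's real column names may differ.
frames = [
    {"frame": 0, "time_s": 0.00, "opening_mm": 0.0, "deviation_mm": 0.0},
    {"frame": 1, "time_s": 0.03, "opening_mm": 4.2, "deviation_mm": 0.5},
    {"frame": 2, "time_s": 0.07, "opening_mm": 9.8, "deviation_mm": 1.1},
]
df = pd.DataFrame(frames)
df.to_csv("jaw_metrics.csv", index=False)  # one row per analyzed video frame
print(df["opening_mm"].max())  # peak jaw opening across the clip
```

A flat one-row-per-frame table like this loads cleanly back into Pandas or a spreadsheet for downstream assessment.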
## Development Setup

1. Install development dependencies:

   ```bash
   pip install -r requirements-dev.txt
   ```

2. Run the tests:

   ```bash
   pytest tests/
   ```
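A test module in `tests/` might look like the following sketch. The `smooth` helper is hypothetical, standing in for the kind of movement-pattern utility the analysis package could expose:

```python
import numpy as np

# Hypothetical smoothing helper: a centered moving average over a 1-D
# series of per-frame jaw-opening values.
def smooth(series, window=3):
    kernel = np.ones(window) / window
    return np.convolve(series, kernel, mode="same")

def test_smooth_preserves_length():
    # "same" mode keeps one output value per input frame.
    assert len(smooth([1.0, 2.0, 3.0, 4.0])) == 4

def test_smooth_of_constant_is_constant_in_interior():
    # A flat signal stays flat away from the boundary frames.
    out = smooth([5.0] * 5)
    assert np.allclose(out[1:-1], 5.0)
```

pytest discovers any `test_*` function in files matching `test_*.py`, so new checks can be added without registration.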
## Project Structure

```
jawtrack/
├── README.md
├── requirements.txt
├── setup.py
├── jawtrack/
│   ├── core/
│   ├── analysis/
│   └── ui/
├── tests/
└── examples/
```
## Contributing

1. Fork the repository
2. Create a feature branch
3. Commit your changes
4. Push to the branch
5. Create a Pull Request
## License
This project is licensed under the MIT License - see the LICENSE file for details.
## Authors

- Your Name - Initial work
## Acknowledgments

- MediaPipe team for the face mesh implementation
- OpenCV community
- Gradio team for the web interface framework