
JawTrack

JawTrack is a real-time jaw motion analysis system that uses computer vision to track and analyze jaw movements. Built with MediaPipe and OpenCV, it provides quantitative measurements for jaw motion assessment.
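
The tracking code itself is not shown in this README, so here is a minimal sketch of how a MediaPipe Face Mesh loop is typically wired up with OpenCV. The landmark indices (13 and 14 for the inner lips) are standard Face Mesh points, but which points JawTrack actually uses is an assumption.

import cv2
import mediapipe as mp

mp_face_mesh = mp.solutions.face_mesh

cap = cv2.VideoCapture(0)  # webcam; pass a file path instead for video-based analysis
with mp_face_mesh.FaceMesh(max_num_faces=1, refine_landmarks=True) as face_mesh:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB input; OpenCV captures BGR
        results = face_mesh.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.multi_face_landmarks:
            lm = results.multi_face_landmarks[0].landmark
            # Inner-lip midpoints: 13 (upper) and 14 (lower), normalized coordinates
            opening = abs(lm[14].y - lm[13].y)
            print(f"normalized jaw opening: {opening:.3f}")
cap.release()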

Features

  • Real-time jaw motion tracking
  • Video-based analysis
  • Quantitative measurements (see the sketch after this list):
    • Jaw opening distance
    • Lateral deviation
    • Movement patterns
  • Data visualization
  • Assessment reports
  • CSV data export
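
The quantitative measurements above can be derived from per-frame landmarks along these lines; pixel units and the nose/chin midline references are illustrative assumptions (a clinical setup would calibrate to millimetres):

import numpy as np
import pandas as pd

def jaw_metrics(landmarks, width, height):
    """Per-frame jaw metrics from normalized Face Mesh landmarks (indices assumed)."""
    def px(i):
        return np.array([landmarks[i].x * width, landmarks[i].y * height])
    upper_lip, lower_lip = px(13), px(14)  # inner-lip midpoints
    nose_tip, chin = px(1), px(152)        # midline reference points
    return {
        "opening_px": float(np.linalg.norm(lower_lip - upper_lip)),  # jaw opening distance
        "deviation_px": float(chin[0] - nose_tip[0]),  # signed lateral deviation of the chin
    }

# Accumulating one dict per frame into `rows` makes the CSV export a one-liner:
# pd.DataFrame(rows).to_csv("jaw_metrics.csv", index=False)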

Requirements

  • Python 3.10+
  • OpenCV
  • MediaPipe
  • Gradio
  • NumPy
  • Pandas
  • Matplotlib
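
The actual requirements.txt is not reproduced here; a file matching the list above would look roughly like this (pip package names, left unpinned; the real file may pin versions):

opencv-python
mediapipe
gradio
numpy
pandas
matplotlib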

Installation

  1. Clone the repository:
git clone https://github.com/yourusername/jawtrack.git
cd jawtrack
  2. Create and activate a virtual environment:
python -m venv venv
source venv/bin/activate  # On Windows: venv\Scripts\activate
  3. Install dependencies:
pip install -r requirements.txt

Usage

  1. Start the application (a minimal sketch of app.py follows these steps):
python app.py
  2. Open your web browser and navigate to:
http://localhost:7860
  3. Upload a video or use the webcam for real-time analysis
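
The entry point app.py is not shown here, so this is only a guess at its general shape: a minimal Gradio interface whose analyze function stands in for the real per-frame pipeline.

import cv2
import gradio as gr

def analyze(video_path):
    """Placeholder analysis pass: counts readable frames in the uploaded clip."""
    cap = cv2.VideoCapture(video_path)
    frames = 0
    while cap.read()[0]:
        frames += 1
    cap.release()
    return f"Processed {frames} frames"

demo = gr.Interface(
    fn=analyze,
    inputs=gr.Video(),  # file upload; Gradio can also expose webcam capture
    outputs="text",
    title="JawTrack",
)

if __name__ == "__main__":
    demo.launch(server_port=7860)  # matches the URL in step 2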

Development Setup

  1. Install development dependencies:
pip install -r requirements-dev.txt
  2. Run the tests (an example test is sketched below):
pytest tests/
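
An illustrative test in the style pytest would pick up under tests/; the helper is inlined so the snippet stands alone, where the real suite would presumably import from the jawtrack package (function name hypothetical).

import numpy as np

def jaw_opening(upper, lower):
    # Inlined stand-in for a package-level metrics helper (hypothetical name)
    return float(np.linalg.norm(np.asarray(lower) - np.asarray(upper)))

def test_jaw_opening_is_euclidean_distance():
    # A 3-4-5 right triangle makes the expected distance exact
    assert jaw_opening((0.0, 0.0), (3.0, 4.0)) == 5.0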

Project Structure

jawtrack/
├── README.md
├── requirements.txt
├── setup.py
├── jawtrack/
│   ├── core/
│   ├── analysis/
│   └── ui/
├── tests/
└── examples/

Contributing

  1. Fork the repository
  2. Create a feature branch
  3. Commit your changes
  4. Push to the branch
  5. Create a Pull Request

License

This project is licensed under the MIT License - see the LICENSE file for details.

Authors

  • Your Name - Initial work

Acknowledgments

  • MediaPipe team for face mesh implementation
  • OpenCV community
  • Gradio team for the web interface framework