---
title: AnythingQ&A
emoji: ❓
colorFrom: blue
colorTo: red
sdk: gradio
sdk_version: 3.48.0
app_file: app.py
pinned: false
---
## Anything Q&A
This tool serves as a customizable version of the amazing ClimateQA by Ekimetrics. It allows you to rapidly create a new question-answering tool using any set of documents as a data source.
## Quick Start
0. **Fork the Repository**
Fork the original HuggingFace space: [Anything Q&A on HuggingFace](https://huggingface.co/spaces/LouisSanna/anything-question-answering).
1. **Set environment variables**
For the OpenAI API:
```env
OPENAI_API_KEY=...
```
2. **Install Dependencies**
```shell
pip install -r requirements.txt
```
3. **Add your data**
Place the PDFs to be used as sources in the `data` folder. The subfolder and file names are used as the default source identifiers by the tool.
```txt
data/
type_1/
source_1.pdf
source_2.pdf
type_2/
source_3.pdf
```
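For reference, here is a minimal sketch, in Python, of how such a layout can be scanned so that subfolder and file names become source identifiers. It only illustrates the convention; the actual logic in `anyqa` may differ.
```python
# Illustrative only: derive identifiers from the data/ folder layout.
# The actual anyqa implementation may differ.
from pathlib import Path

DATA_DIR = Path("data")

def list_sources(data_dir: Path = DATA_DIR):
    """Yield (source_type, source_name, pdf_path) derived from the folder layout."""
    for pdf_path in sorted(data_dir.glob("*/*.pdf")):
        source_type = pdf_path.parent.name  # e.g. "type_1"
        source_name = pdf_path.stem         # e.g. "source_1"
        yield source_type, source_name, pdf_path

if __name__ == "__main__":
    for source_type, source_name, path in list_sources():
        print(f"{source_type} / {source_name}: {path}")
```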
4. **Build the index of semantic vectors**
```bash
python -m anyqa.build_index
```
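Conceptually, building the index means extracting the text from each PDF, splitting it into chunks, embedding each chunk, and storing the vectors for retrieval. The sketch below shows that idea using `pypdf` and the OpenAI embeddings API; it is only an illustration, not the actual `anyqa.build_index`, which may use different libraries and a different vector store.
```python
# Conceptual sketch of building a semantic vector index; anyqa.build_index may differ.
import json
from pathlib import Path

from openai import OpenAI  # reads OPENAI_API_KEY from the environment
from pypdf import PdfReader

client = OpenAI()

def chunk_text(text: str, size: int = 1000, overlap: int = 200):
    """Split text into overlapping character chunks."""
    step = size - overlap
    return [text[i:i + size] for i in range(0, len(text), step)]

def build_index(data_dir: str = "data", out_path: str = "index.json"):
    records = []
    for pdf_path in sorted(Path(data_dir).glob("*/*.pdf")):
        text = "".join(page.extract_text() or "" for page in PdfReader(pdf_path).pages)
        for chunk in chunk_text(text):
            embedding = client.embeddings.create(
                model="text-embedding-3-small", input=chunk
            ).data[0].embedding
            records.append({"source": str(pdf_path), "text": chunk, "embedding": embedding})
    Path(out_path).write_text(json.dumps(records))

if __name__ == "__main__":
    build_index()
```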
5. **Launch the app**
```shell
python app.py
```
And you're done!
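For orientation, a Gradio Q&A app of this kind boils down to something like the sketch below. This is not the actual `app.py` (which presumably wires the retrieved chunks and the configured AI provider into the answer); it only shows the Gradio scaffolding.
```python
# Minimal Gradio chat sketch; the real app.py is more elaborate.
import gradio as gr

def answer(question: str, history):
    # Placeholder: the real app would retrieve relevant chunks from the index
    # and ask the configured AI provider to answer from them.
    return f"(answer to: {question})"

demo = gr.ChatInterface(answer)

if __name__ == "__main__":
    demo.launch()
```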
## Deployment
Deploying via Hugging Face Spaces is the most straightforward approach. Simply push your code to a HuggingFace Gradio space, and it will function as is.
You will need to set the environment variables in the Space settings.
## AI providers
The following providers are currently supported:
- OpenAI
- Azure OpenAI
The provider is picked based on the environment variables; see `example.env`.
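As an illustration of what that selection can look like, here is a minimal sketch. The Azure variable name below is a placeholder; the variables actually read by the app are the ones listed in `example.env`.
```python
# Illustrative provider selection from environment variables.
# AZURE_OPENAI_API_KEY is a placeholder name; see example.env for the real ones.
import os

def pick_provider() -> str:
    if os.environ.get("AZURE_OPENAI_API_KEY"):
        return "azure-openai"
    if os.environ.get("OPENAI_API_KEY"):
        return "openai"
    raise RuntimeError("No AI provider configured; see example.env")
```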