---
license: cc-by-4.0
pretty_name: Chabiko Stream
new_version: WitchesSocialStream/Four-Leaf-Clover
---
## Superseded

See WitchesSocialStream/Four-Leaf-Clover for the newer version!
## CURRENTLY OFFLINE

I really didn't like the code I used for 4chan scraping. It was overly complex and prone to failure, so I'm pausing data collection on this dataset for now while I think of a better solution.
# Dataset Card for Chabiko Stream

> "Rule 34 (Part AI). Anything that can be in a dataset, will be in one eventually given enough time." - KaraKaraWitch
XChan + Chibiko -> Chabiko
Chabiko is an XChan scraper, and Chabiko Stream is accordingly a daily dump of threads and posts from XChan boards.
## Archiver Modes

Broadly, the archiver runs in two modes (a rough sketch follows the list):

- "Archive" [The board supports archives of closed threads.]
  - Archives are refreshed at least every 10 minutes; if a board is moving faster, refreshes happen more often.
- "PostStream" [The board does not support archives, so the archiver falls back to streaming individual posts. (This is similar to MissingKeys.)]
  - Streams are refreshed at least every 20 seconds; if a board is moving faster, refreshes happen more often.
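The scheduler itself isn't documented here; purely as an illustration of those refresh ceilings, a polling loop might look like the sketch below. The function names and the activity heuristic are invented for the sketch.

```python
import time

# Refresh ceilings taken from the list above (in seconds).
ARCHIVE_MAX_INTERVAL = 10 * 60  # "Archive" mode: refresh at least every 10 minutes
STREAM_MAX_INTERVAL = 20        # "PostStream" mode: refresh at least every 20 seconds


def next_interval(posts_per_minute: float, ceiling: float) -> float:
    """Illustrative heuristic: busier boards get shorter intervals,
    but the wait never exceeds the documented ceiling."""
    if posts_per_minute <= 0:
        return ceiling
    return min(ceiling, 60.0 / posts_per_minute)


def poll_board(fetch, ceiling: float) -> None:
    """Call fetch() forever, adapting the sleep to board activity.
    fetch() is assumed to return the board's recent posts-per-minute."""
    while True:
        activity = fetch()
        time.sleep(next_interval(activity, ceiling))
```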
## Supported Sites
- 4chan
- [TBD] Futaba Channel
## Formats

Both archiver modes are written to JSONL:

- For Archive mode, each JSON line represents an entire thread dump.
- For PostStream mode, each JSON line represents a single post.
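As a quick sketch of how a dump could be consumed (the file name is hypothetical; each line is one JSON object), the two record shapes can be told apart by their keys, matching the models below:

```python
import json

# Hypothetical file name; each line of a dump is one JSON object.
with open("dump.jsonl", encoding="utf-8") as fp:
    for line in fp:
        record = json.loads(line)
        if "posts" in record:
            # Archive mode: the record is an entire thread dump.
            print("thread:", record.get("title"), "-", len(record["posts"]), "posts")
        else:
            # PostStream mode: the record is a single post.
            print("post", record["pid"], "on board", record["board"])
```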
## Models

Refer to the following pydantic models for parsing:
```python
from typing import Optional

import pydantic


class AttachmentData(pydantic.BaseModel):
    attachment_url: str
    filename: str


# One JSON line in a PostStream dump is a single Post.
class Post(pydantic.BaseModel):
    board: str
    thread: int
    pid: int
    name: str
    msg: str
    attachment: Optional[AttachmentData] = None
    posted: int


# One JSON line in an Archive dump is an entire Thread.
class Thread(pydantic.BaseModel):
    title: Optional[str]
    posts: list[Post]
```
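A minimal parsing sketch, assuming Pydantic v2 and a hypothetical dump file name; PostStream lines would be validated as `Post` instead:

```python
# Minimal sketch, assuming Pydantic v2; the file name is hypothetical.
with open("archive_dump.jsonl", encoding="utf-8") as fp:
    for line in fp:
        thread = Thread.model_validate_json(line)
        for post in thread.posts:
            print(post.pid, post.name, post.msg[:80])

# For PostStream dumps, each line is a single post:
#   post = Post.model_validate_json(line)
```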
## Notices

- Archiving only text for now, since I need to see how much data images would take up.
- Due to potential abuse, we are closing community posts. Contact KaraKaraWitch through other known channels.
## License
Apache 2.0.