✂️Text Splitters
When you want to deal with long pieces of text, it is necessary to split up that text into chunks.
As simple as this sounds, there is a lot of potential complexity here. Ideally, you want to keep the semantically related pieces of text together. What "semantically related" means could depend on the type of text. This section showcases several ways to do that.
At a high level, text splitters work as follows:
1. Split the text up into small, semantically meaningful chunks (often sentences).
2. Start combining these small chunks into a larger chunk until you reach a certain size (as measured by some function).
3. Once you reach that size, make that chunk its own piece of text and then start creating a new chunk of text with some overlap (to keep context between chunks).
That means there are two different axes along which you can customize your text splitter:
1. How the text is split
2. How the chunk size is measured (both axes are illustrated in the sketch after this list)
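To make the two axes concrete, here is a minimal sketch, assuming LangChain's Python `langchain-text-splitters` package; the sample text and parameter values are made up for illustration.

```python
# Minimal sketch of the two axes: how the text is split (separators) and how
# chunk size is measured (length_function). Assumes `pip install langchain-text-splitters`;
# the sample text and numbers are illustrative only.
from langchain_text_splitters import RecursiveCharacterTextSplitter

text = (
    "Text splitting sounds simple but is surprisingly subtle. "
    "You want chunks small enough for your model yet large enough to keep "
    "semantically related sentences together.\n\n"
    "Overlap between chunks helps preserve context across boundaries."
)

splitter = RecursiveCharacterTextSplitter(
    separators=["\n\n", "\n", " "],  # axis 1: how the text is split
    chunk_size=120,                  # axis 2: target size per chunk...
    chunk_overlap=20,                # overlap carried into the next chunk
    length_function=len,             # ...measured here in characters (could be a token counter)
)

for i, chunk in enumerate(splitter.split_text(text)):
    print(f"--- chunk {i} ({len(chunk)} chars) ---")
    print(chunk)
```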
1) Character Text Splitter: Splits only on one type of character (defaults to "\n\n").
2) Code Text Splitter: Splits documents based on language-specific syntax.
3) HTML-to-Markdown Text Splitter: Converts HTML to Markdown and then splits your content into documents based on the Markdown headers.
4) Markdown Text Splitter: Splits your content into documents based on the Markdown headers.
5) Recursive Character Text Splitter: Splits documents recursively by different characters, starting with "\n\n", then "\n", then " ".
6) Token Text Splitter: Splits a raw text string by first converting the text into BPE tokens, then splitting these tokens into chunks and converting the tokens within a single chunk back into text. (A usage sketch for several of these splitters follows this list.)
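These descriptions correspond to splitter classes in LangChain. The sketch below shows rough usage for three of them; the class names are LangChain's Python ones, the sample inputs are invented, and the Token Text Splitter additionally needs the `tiktoken` package.

```python
# Rough usage sketch for three of the splitters described above, using
# LangChain's Python classes. Sample inputs are invented; TokenTextSplitter
# needs the `tiktoken` package installed.
from langchain_text_splitters import (
    CharacterTextSplitter,
    MarkdownHeaderTextSplitter,
    TokenTextSplitter,
)

# Character splitting: splits on a single separator only (here "\n\n").
char_splitter = CharacterTextSplitter(separator="\n\n", chunk_size=200, chunk_overlap=0)
char_chunks = char_splitter.split_text("First paragraph.\n\nSecond paragraph.")

# Markdown header splitting: chunks follow the header hierarchy, and the
# headers are kept as metadata on each resulting document.
md_splitter = MarkdownHeaderTextSplitter(
    headers_to_split_on=[("#", "Header 1"), ("##", "Header 2")]
)
md_docs = md_splitter.split_text("# Intro\nSome text.\n## Details\nMore text.")

# Token splitting: converts the text to BPE tokens, chunks the tokens,
# then decodes each chunk back into text.
token_splitter = TokenTextSplitter(chunk_size=50, chunk_overlap=5)
token_chunks = token_splitter.split_text("A long raw text string to split by tokens.")

print(char_chunks)
print(md_docs)
print(token_chunks)
```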
7) Recursive JSON Text Splitter
The Recursive JSON Text Splitter operates by traversing the JSON data recursively, ensuring that each chunk adheres to specified size constraints. It maintains the hierarchical structure of the JSON, which is crucial for preserving the context and relationships within the data.
Recursive Traversal: The splitter navigates through nested JSON objects and arrays, breaking them down into smaller pieces while preserving their structure.
Chunk Size Configuration:
Maximum Chunk Size: Defines the upper limit for the size of each chunk.
Minimum Chunk Size: Ensures that chunks are not too small, which could lead to loss of context.
List Conversion: Optionally converts lists into dictionaries to facilitate more effective chunking.
Output Formats:
Structured JSON Chunks: Returns a list of smaller JSON objects.
JSON-Formatted Strings: Provides chunks as JSON strings, suitable for text-based processing. (Both output formats appear in the sketch below.)
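Below is a minimal sketch of this behaviour, assuming LangChain's Python `RecursiveJsonSplitter`. The nested data and size limits are invented, and the parameter names follow the current `langchain-text-splitters` API, so they may differ in other versions.

```python
# Minimal sketch of recursive JSON splitting with LangChain's RecursiveJsonSplitter.
# The nested data and size limits are invented for illustration.
from langchain_text_splitters import RecursiveJsonSplitter

data = {
    "company": {
        "name": "Acme",
        "departments": [
            {"name": "research", "employees": ["ada", "grace"]},
            {"name": "sales", "employees": ["alan", "edsger"]},
        ],
    }
}

splitter = RecursiveJsonSplitter(
    max_chunk_size=300,  # upper limit on the serialized size of each chunk
    min_chunk_size=50,   # keep chunks from becoming too small to carry context
)

# Structured JSON chunks: a list of smaller dicts that preserve the hierarchy.
# convert_lists=True turns lists into index-keyed dicts so they chunk more evenly.
json_chunks = splitter.split_json(json_data=data, convert_lists=True)

# JSON-formatted strings: the same chunks serialized for text-based processing.
text_chunks = splitter.split_text(json_data=data, convert_lists=True)

for chunk in text_chunks:
    print(chunk)
```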