keyinfostruct is a lightweight Python package that extracts and structures key information from personal experience narratives (e.g., social‑media posts, messages).
It automatically identifies the main points, emotions, and suggested actions from free‑form text using an LLM and returns the data in a ready‑to‑use list.
- One‑function API – just call `keyinfostruct(...)`.
- Works out‑of‑the‑box with ChatLLM7 (default LLM).
- Plug‑and‑play with any LangChain‑compatible chat model (OpenAI, Anthropic, Google, etc.).
- Returns data that matches a strict regular‑expression pattern, guaranteeing predictable output.
```bash
pip install keyinfostruct
```

```python
from keyinfostruct import keyinfostruct

user_input = """
I just posted on Instagram that I'm getting divorced.
Now I see a bot impersonating me and replying to my friends with weird messages.
I'm feeling angry and scared.
What should I do?
"""
# Use the default ChatLLM7 (API key is read from env LLM7_API_KEY)
response = keyinfostruct(user_input)
print(response)
# Example output:
# [
# "Event: announced divorce on Instagram",
# "Emotion: angry, scared",
# "Suggested Action: report impersonation, change passwords, inform friends"
# ]
```

| Parameter | Type | Description |
|---|---|---|
| `user_input` | `str` | The raw text you want to analyse. |
| `llm` (optional) | `BaseChatModel` | Any LangChain chat model. If omitted, the package instantiates `ChatLLM7` automatically. |
| `api_key` (optional) | `str` | LLM7 API key. If not supplied, the function reads the environment variable `LLM7_API_KEY`. If that is also missing, a placeholder "None" is used (the request will fail unless you provide a real key). |
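
Each entry in the returned list follows a `Label: value` shape (see the example output above), so it is easy to post-process the result into a dictionary if that suits your pipeline better. A minimal sketch, reusing `user_input` from the quick start; this helper is not part of the package API:

```python
from keyinfostruct import keyinfostruct

response = keyinfostruct(user_input)

# Turn entries such as "Emotion: angry, scared" into a {label: value} mapping.
structured = {}
for entry in response:
    label, _, value = entry.partition(":")
    structured[label.strip()] = value.strip()

print(structured.get("Emotion"))  # e.g. "angry, scared"
```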
You can swap the default model for any LangChain‑compatible chat model.
OpenAI:

```python
from langchain_openai import ChatOpenAI
from keyinfostruct import keyinfostruct

llm = ChatOpenAI(model="gpt-4o-mini")
response = keyinfostruct(user_input, llm=llm)
```

Anthropic:

```python
from langchain_anthropic import ChatAnthropic
from keyinfostruct import keyinfostruct

llm = ChatAnthropic(model="claude-3-haiku-20240307")
response = keyinfostruct(user_input, llm=llm)
```

Google:

```python
from langchain_google_genai import ChatGoogleGenerativeAI
from keyinfostruct import keyinfostruct

llm = ChatGoogleGenerativeAI(model="gemini-1.5-flash")
response = keyinfostruct(user_input, llm=llm)
```

When you rely on the default ChatLLM7 backend, the following LLM7 tiers apply:

- Free tier: sufficient for most development and small‑scale usage.
- Higher limits: obtain an upgraded key from the LLM7 dashboard.
Set the key via environment variable:
```bash
export LLM7_API_KEY="your_llm7_api_key"
```

Or pass it directly:

```python
response = keyinfostruct(user_input, api_key="your_llm7_api_key")
```

Free keys are available after registration at https://token.llm7.io/.
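
If you prefer to keep everything in Python, you can also set the environment variable in-process before the first call (a minimal sketch, equivalent to the shell export above):

```python
import os

# Equivalent to the shell export; the key below is a placeholder.
os.environ["LLM7_API_KEY"] = "your_llm7_api_key"

from keyinfostruct import keyinfostruct

response = keyinfostruct(user_input)  # key is picked up from the environment
```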
keyinfostruct builds a LangChain message chain consisting of a system prompt, a human prompt (your `user_input`), and a regex pattern defined in `keyinfostruct.prompts`.
The helper `llmatch` runs the LLM, validates the output against the pattern, and returns the extracted list. If the LLM response does not satisfy the pattern, a `RuntimeError` is raised.
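
Because a failed match surfaces as a `RuntimeError`, it can be worth guarding the call when you process untrusted or very short inputs, e.g.:

```python
from keyinfostruct import keyinfostruct

try:
    items = keyinfostruct(user_input)
except RuntimeError as exc:
    # Raised when the LLM response does not satisfy the expected pattern.
    print(f"Extraction failed: {exc}")
else:
    for item in items:
        print(item)  # e.g. "Event: announced divorce on Instagram"
```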
- Issues & feature requests: https://github.com/chigwell/keyinfostruct/issues
- Pull requests are welcome—please follow the repository’s contribution guidelines.
This project is licensed under the MIT License.
Eugene Evstafev
Email: hi@euegne.plus
GitHub: chigwell
Happy structuring!