
ProphecyPerfect

License: MIT

A Python package for analyzing textual statements and verifying whether they correctly express prophetic or visionary claims in the perfect tense.

Installation

pip install prophecyperfect

Overview

The prophecyperfect package takes a brief textual description, prompt, or statement from the user and analyzes it to determine whether it correctly expresses a prophetic or visionary statement in the perfect tense. Using pattern matching and structured responses, it either confirms the input as prophetic, reaffirming its status, or flags inaccuracies and non-prophetic phrasing.
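The package's internal logic is not shown here, but the idea of perfect-tense pattern matching can be illustrated with a small standalone heuristic. This is a sketch only: the function and regex below are hypothetical and are not part of prophecyperfect's API.

```python
import re

# Hypothetical heuristic, NOT prophecyperfect's actual implementation.
# The "prophetic perfect" states a future certainty in a completed form,
# e.g. "I have given you the land" spoken before the fact.
PERFECT_MARKERS = re.compile(
    r"\b(?:ha(?:ve|s|th)\s+\w+(?:ed|en)|it\s+is\s+(?:done|finished))\b",
    re.IGNORECASE,
)

def looks_like_prophetic_perfect(text: str) -> bool:
    """Rough check: does the statement contain a perfect-tense marker?"""
    return bool(PERFECT_MARKERS.search(text))

print(looks_like_prophetic_perfect("Behold, I have given you every herb."))  # True
print(looks_like_prophetic_perfect("I will give you the land."))             # False
```

A real classifier would need far more than a regex (irregular participles, context, intent), which is why the package delegates the judgment to an LLM.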

Usage

Basic Usage

from prophecyperfect import prophecyperfect

# Analyze a prophetic statement
user_input = "Thus saith the Lord: I will surely bless you, and I will multiply you exceedingly."
response = prophecyperfect(user_input)
print(response)

Using Custom LLM

You can also use your own LLM instance from LangChain. Here are examples with different providers:

OpenAI

from langchain_openai import ChatOpenAI
from prophecyperfect import prophecyperfect

llm = ChatOpenAI()
response = prophecyperfect(user_input, llm=llm)

Anthropic

from langchain_anthropic import ChatAnthropic
from prophecyperfect import prophecyperfect

llm = ChatAnthropic()
response = prophecyperfect(user_input, llm=llm)

Google

from langchain_google_genai import ChatGoogleGenerativeAI
from prophecyperfect import prophecyperfect

llm = ChatGoogleGenerativeAI()
response = prophecyperfect(user_input, llm=llm)

API Key Configuration

The default rate limits for LLM7 free tier are sufficient for most use cases. If you need higher rate limits, you can:

  1. Set the API key as an environment variable:
export LLM7_API_KEY="your_api_key_here"
  2. Or pass it directly to the function:
response = prophecyperfect(user_input, api_key="your_api_key_here")

You can obtain a free API key by registering at https://token.llm7.io/.
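The key precedence described above can be sketched as a small resolver. This is a hypothetical helper for illustration only; prophecyperfect performs this resolution internally.

```python
import os
from typing import Optional

def resolve_llm7_key(api_key: Optional[str] = None) -> Optional[str]:
    """Mirror the documented precedence: an explicitly passed key wins,
    then the LLM7_API_KEY environment variable, else None (free tier)."""
    return api_key or os.environ.get("LLM7_API_KEY")

os.environ["LLM7_API_KEY"] = "env_key"
print(resolve_llm7_key("explicit_key"))  # explicit_key
print(resolve_llm7_key())                # env_key
```

Returning None simply means no key was supplied anywhere, so the call falls through to the free tier's default rate limits.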

Parameters

  • user_input (str): The user input text to process.
  • llm (Optional[BaseChatModel]): The LangChain LLM instance to use. If not provided, the default ChatLLM7 will be used.
  • api_key (Optional[str]): The API key for LLM7. If not provided, it will use the environment variable LLM7_API_KEY or the default free tier.

Author

Issues

For any issues or to contribute, please visit the GitHub repository: https://github.com/chigwell/prophecyperfect