AmpleAI Plugin is an Amplenote plugin that brings OpenAI & Ollama interactivity to Amplenote.
- Clone this repo:
  ```bash
  git clone git@github.com:alloy-org/openai-plugin.git
  ```
- Install node and npm if you haven't already.
- Run `npm install` to install the packages.
- Copy `.env.example` to `.env` and fill in the environment variable for your OpenAI key.
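The `.env.example` file in the repo defines the exact variable name to use; as a sketch, the resulting `.env` typically looks like the following (the `OPENAI_API_KEY` name here is an assumption — copy whatever name `.env.example` declares):

```bash
# .env — local secrets, keep out of version control
# Variable name assumed for illustration; use the one from .env.example
OPENAI_API_KEY=sk-your-key-here
```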
To run a specific test file, use:

```bash
NODE_OPTIONS=--experimental-vm-modules npm test -- test/plugin.test.js
```
For Cursor & LLMs to invoke a test file, the following needs to run OUTSIDE SANDBOX MODE; otherwise the tests will fail with `Error: Cannot find module '@jest/test-sequencer'`:

```bash
cd /Users/bill/src/ai-plugin && NODE_OPTIONS=--experimental-vm-modules npm test -- test/path-to-file.test.js
```

To invoke a test file while only running the test whose name includes "llama":

```bash
cd /Users/bill/src/ai-plugin && NODE_OPTIONS=--experimental-vm-modules npm test -- test/path-to-file.test.js -t "llama"
```

Or, to run all the tests:

```bash
NODE_OPTIONS=--experimental-vm-modules npm test
```
To run a single test by name within a specific file:

```bash
npm test -- -t "should allow appOption freeform Q&A" test/plugin.test.js
```

If you don't have Ollama running locally, you can skip the local LLM tests (which test Mistral and other Ollama models) by setting the `LOCAL_MODELS` environment variable to `suspended`:

```bash
LOCAL_MODELS=suspended NODE_OPTIONS=--experimental-vm-modules npm test
```

This prevents test failures from local model tests when Ollama is not running.
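Honoring this flag only takes a small guard on the environment variable; the following is a hedged sketch of the pattern, not the plugin's actual test code:

```javascript
// Sketch (assumed pattern): decide whether local-LLM (Ollama) tests should run,
// based on the LOCAL_MODELS environment variable described above.
function localModelTestsEnabled(env) {
  return env.LOCAL_MODELS !== "suspended";
}

console.log(localModelTestsEnabled({ LOCAL_MODELS: "suspended" })); // false
console.log(localModelTestsEnabled({}));                            // true
```

A test suite can use this to switch between `describe` and `describe.skip` for its Ollama-backed test groups.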
https://public.amplenote.com/F4rghypGZSXEjjFLiXQTxxcR
To run the tests in watch mode:

```bash
NODE_OPTIONS=--experimental-vm-modules npm run test -- --watch
```

- https://esbuild.github.io/getting-started/#your-first-bundle
- https://jestjs.io/
- https://www.gitclear.com
To use a local Ollama server with the plugin, start it with the Amplenote plugins origin allowed (Ollama's `OLLAMA_ORIGINS` variable controls which browser origins may call the server):

```bash
OLLAMA_ORIGINS=https://plugins.amplenote.com ollama serve
```
