
Vercel Long Running task (edge function) #1151

@Skn0tt

Description


Discussed in #1150

Originally posted by nilooy June 25, 2023
Is it possible to use a Quirrel queue with a Vercel edge function? I was looking specifically to run this as a background job via Quirrel:
https://github.com/inngest/vercel-ai-sdk/blob/main/examples/next-openai/app/api/chat/route.ts

I tried the following approach:

import { Queue as TestQueue } from "quirrel/next";
import { Configuration, OpenAIApi } from "openai-edge";
import { OpenAIStream, StreamingTextResponse } from "ai";

export const runtime = "edge";

const config = new Configuration({
  apiKey: process.env.OPENAI_API_KEY,
});
const openai = new OpenAIApi(config);

// @ts-ignore
export default TestQueue("api/test", async (params) => {
  const response = await openai.createChatCompletion({
    model: "gpt-3.5-turbo",
    stream: true,
    messages: [{ role: "user", content: "explain the next js" }],
  });

  const stream = OpenAIStream(response);
  // Respond with the stream
  return new StreamingTextResponse(stream);
});

and enqueued it from another route:

await TestQueue.enqueue({ test: 123 });

This results in the following error at runtime:

👟Executing job
  queue: /api/test
     id: 7f0226c0-4824-4671-9efa-e926484e95ae
   body: {"test":123}
error - node_modules/quirrel/dist/esm/src/client/enhanced-json.js (13:0) @ Module.parse
error - Unexpected token o in JSON at position 1
null
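For what it's worth, "Unexpected token o in JSON at position 1" is the classic symptom of calling `JSON.parse` on a value that is already an object rather than a JSON string: the argument gets coerced via `String()` to `"[object Object]"`, and parsing fails at the `o` in position 1. A minimal sketch of that failure mode (assumption: Quirrel's `enhanced-json` parse receives an already-parsed body object under the edge runtime instead of the raw string it expects):

```typescript
// An already-parsed body, as the edge runtime may hand it over.
const body: unknown = { test: 123 };

try {
  // JSON.parse coerces a non-string argument with String(),
  // so it effectively tries to parse "[object Object]" ...
  JSON.parse(body as string);
} catch (err) {
  // ... and throws a SyntaxError at the "o" in position 1.
  console.log((err as Error).message);
}
```

The exact wording of the message varies across V8 versions, but the cause is the same: the parser never sees a JSON string.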

This worked perfectly without the edge runtime.
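For contrast, here is the round-trip the handler presumably expects on the Node.js runtime: `enqueue` serializes the payload to a JSON string, and the handler-side parse succeeds because the body arrives as that string. This is a sketch of the general serialize/parse shape, not Quirrel's actual `enhanced-json` code:

```typescript
// The payload passed to enqueue().
const payload = { test: 123 };

// What goes over the wire: a JSON string.
const wireBody = JSON.stringify(payload); // '{"test":123}'

// What the handler receives and parses back into an object.
const parsed = JSON.parse(wireBody) as { test: number };
console.log(parsed.test); // 123
```

Under the edge runtime the second step apparently receives an object instead of the string, which matches the error above.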
