# ask-ai
c
how to use groq as llm instead of openai llm
g
To use Groq as an LLM instead of OpenAI, you can use the `openai.LLM.with_groq` method. It creates a Groq-backed LLM instance; the API key can come from the `GROQ_API_KEY` environment variable or be passed as an argument, and the `base_url` should point to "https://api.groq.com/openai/v1" (Groq's OpenAI-compatible endpoint). Here's a basic example:
```python
import os

from livekit.plugins import openai

llm = openai.LLM.with_groq(
    model="llama3-8b-8192",
    api_key=os.environ.get("GROQ_API_KEY"),
    base_url="https://api.groq.com/openai/v1",
)
```
Would you like more details on setting up or using this configuration?

Sources:
- LiveKit OpenAI Plugin Documentation
- LiveKit Slack Discussion
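If `GROQ_API_KEY` is already exported in your environment, the explicit arguments can usually be dropped. A minimal sketch, assuming `with_groq` falls back to the `GROQ_API_KEY` environment variable and defaults `base_url` to the Groq endpoint:

```python
from livekit.plugins import openai

# Minimal sketch: relies on GROQ_API_KEY being set in the environment and on
# with_groq() defaulting base_url to https://api.groq.com/openai/v1.
llm = openai.LLM.with_groq(model="llama3-8b-8192")
```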
c
```python
import logging

from dotenv import load_dotenv
from livekit.agents import (
    AutoSubscribe,
    JobContext,
    JobProcess,
    WorkerOptions,
    cli,
    llm,
    metrics,
)
from livekit.agents.pipeline import VoicePipelineAgent
from livekit.plugins import cartesia, openai, deepgram, silero, turn_detector

load_dotenv(dotenv_path=".env.local")
logger = logging.getLogger("voice-agent")


def prewarm(proc: JobProcess):
    proc.userdata["vad"] = silero.VAD.load()


async def entrypoint(ctx: JobContext):
    initial_ctx = llm.ChatContext().append(
        role="system",
        text=(
            "You are a voice assistant created by LiveKit. Your interface with users will be voice. "
            "You should use short and concise responses, and avoid using unpronounceable punctuation. "
            "You were created as a demo to showcase the capabilities of LiveKit's agents framework."
        ),
    )

    logger.info(f"connecting to room {ctx.room.name}")
    await ctx.connect(auto_subscribe=AutoSubscribe.AUDIO_ONLY)

    # Wait for the first participant to connect
    participant = await ctx.wait_for_participant()
    logger.info(f"starting voice assistant for participant {participant.identity}")

    # This project is configured to use Deepgram STT, OpenAI LLM and Cartesia TTS plugins
    # Other great providers exist like Cerebras, ElevenLabs, Groq, Play.ht, Rime, and more
    # Learn more and pick the best one for your app:
    # https://docs.livekit.io/agents/plugins
    agent = VoicePipelineAgent(
        vad=ctx.proc.userdata["vad"],
        stt=deepgram.STT(),
        llm=openai.LLM(model="gpt-4o-mini"),
        tts=cartesia.TTS(),
        turn_detector=turn_detector.EOUModel(),
        # minimum delay for endpointing, used when turn detector believes the user is done with their turn
        min_endpointing_delay=0.5,
        # maximum delay for endpointing, used when turn detector does not believe the user is done with their turn
        max_endpointing_delay=5.0,
        chat_ctx=initial_ctx,
    )

    usage_collector = metrics.UsageCollector()

    @agent.on("metrics_collected")
    def on_metrics_collected(agent_metrics: metrics.AgentMetrics):
        metrics.log_metrics(agent_metrics)
        usage_collector.collect(agent_metrics)

    agent.start(ctx.room, participant)

    # The agent should be polite and greet the user when it joins :)
    await agent.say("Hey, how can I help you today?", allow_interruptions=True)


if __name__ == "__main__":
    cli.run_app(
        WorkerOptions(
            entrypoint_fnc=entrypoint,
            prewarm_fnc=prewarm,
        ),
    )
```

This is my current version. Use Groq here as the LLM.
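A minimal sketch of that change: inside `entrypoint`, keep everything else as-is and only swap the `llm=` argument of `VoicePipelineAgent` to `openai.LLM.with_groq`, assuming `GROQ_API_KEY` is set in `.env.local` (loaded by `load_dotenv` above) and that `llama3-8b-8192` is the Groq model you want:

```python
# Replaces the VoicePipelineAgent construction above; only the llm= line changes.
agent = VoicePipelineAgent(
    vad=ctx.proc.userdata["vad"],
    stt=deepgram.STT(),
    # with_groq() targets Groq's OpenAI-compatible endpoint and reads the key
    # from GROQ_API_KEY, which .env.local should provide.
    llm=openai.LLM.with_groq(model="llama3-8b-8192"),
    tts=cartesia.TTS(),
    turn_detector=turn_detector.EOUModel(),
    min_endpointing_delay=0.5,
    max_endpointing_delay=5.0,
    chat_ctx=initial_ctx,
)
```

The rest of the pipeline (Deepgram STT, Cartesia TTS, Silero VAD, turn detection) stays the same.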