If you are using crew.ai and want to use a different LLM, one option that is not covered well in the documentation is Amazon Bedrock.

In the crew examples, like the Stock Analysis one I was testing locally, the readme has instructions on how to replace OpenAI with Ollama, but the notes look to be copied from another example and don't cover much else.

But you can see from that readme that it uses LangChain to handle the Ollama swap. And it happens that LangChain already has support for Bedrock, so swapping in Amazon Bedrock is not too tough.

You will need to have the AWS CLI and boto3 installed, and you will need a set of credentials in your profile that grants access to Amazon Bedrock, with access enabled for the models you are looking to target.
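The prerequisites above can be sketched roughly as follows (the package names and the profile name "default" are assumptions; adjust them to your own setup):

```shell
# Install the AWS CLI, boto3, and the LangChain community package
pip install awscli boto3 langchain-community

# Configure a credentials profile (here "default") for an IAM
# identity that is allowed to call Amazon Bedrock
aws configure --profile default

# Note: model access must also be enabled in the Bedrock console
# for the models you want to target (e.g. Anthropic Claude)
```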

First, import the Bedrock class from LangChain and define an llm variable that uses it, passing the credentials profile name described above.

from langchain_community.llms import Bedrock

llm = Bedrock(
    credentials_profile_name="default",
    model_id="anthropic.claude-v2:1",
)
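If you want to tune generation behavior, the Bedrock wrapper also accepts a model_kwargs dict that is forwarded in the model invocation body. A sketch (the parameter values here are assumptions, and instantiating this requires valid AWS credentials):

```python
from langchain_community.llms import Bedrock

# model_kwargs is passed through to the Bedrock model invocation;
# Claude v2 accepts keys such as temperature and max_tokens_to_sample
llm = Bedrock(
    credentials_profile_name="default",
    model_id="anthropic.claude-v2:1",
    model_kwargs={
        "temperature": 0.1,           # assumed value; lower = more deterministic
        "max_tokens_to_sample": 512,  # assumed cap on response length
    },
)
```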

Then add a reference to that LLM object in all the agent definitions that use the default LLM:

def research_coordinator(self):
    return Agent(
        role='Staff Research Analyst',
        goal="""Being the best at gather, interpret data and amaze
        your customer with it""",
        backstory="""Known as the BEST research analyst, you're
        skilled in sifting through news, company announcements,
        and market sentiments. Now you're working on a super
        important customer""",
        verbose=True,
        llm=llm,
        tools=[
            BrowserTools.scrape_and_summarize_website,
            SearchTools.search_internet,
            SearchTools.search_news,
            YahooFinanceNewsTool(),
            SECTools.search_10q,
            SECTools.search_10k
        ]
    )

At that point you should be able to run main.py and have it submit requests to Amazon Bedrock. You will still see an error thrown, because the prompt format for the model I chose (Claude) does not match the prompt format of the default OpenAI LLM.

The error will look something like this:

  warnings.warn(ALTERNATION_ERROR + f" Received {input_text}")
^CTraceback (most recent call last):

You will need to adjust the prompts to include indicators like "Human:" and "Assistant:", but once that is done you should be up and running your examples with Amazon Bedrock powering your app.
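Claude expects prompts to alternate between a human turn and an assistant turn. A minimal helper sketch for wrapping a plain prompt in that shape (the function name to_claude_prompt is my own, not part of LangChain or crew.ai):

```python
def to_claude_prompt(text: str) -> str:
    """Wrap a plain prompt in the turn markers Claude v2 expects.

    The prompt must begin with a blank line followed by "Human:" and
    end with a blank line followed by "Assistant:".
    """
    return f"\n\nHuman: {text}\n\nAssistant:"
```

With that, a plain prompt such as "Summarize this filing" becomes a string Claude will accept without the alternation warning.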