Example solutions: `/explore` and `/recommend` endpoints
Here are example solutions for the `/explore` and `/recommend` endpoints.
Endpoint 4: /explore - Genre-Based Discovery
Example Solution
@app.get("/explore", response_model=ExplorerResponse)
def explore_movies(
    genre: str = Query(..., description="Movie genre to explore"),
    year_min: Optional[int] = Query(
        None, description="Filter by release year - from this year"
    ),
    year_max: Optional[int] = Query(
        None, description="Filter by release year - to this year"
    ),
):
    """
    Explore movies by genre(s) and optional year range
    - Returns most popular movies best matching the specified genre
    - Can filter by year
    - Sorted by popularity/rating
    """
    try:
        with connect_to_weaviate() as client:
            movies = client.collections.use(CollectionName.MOVIES)
            # Build the optional year-range filter; check `is not None`
            # so a provided bound is never mistaken for a missing one
            if year_min is not None and year_max is not None:
                filters = (
                    Filter.by_property("year").greater_or_equal(year_min)
                    & Filter.by_property("year").less_or_equal(year_max)
                )
            elif year_min is not None:
                filters = Filter.by_property("year").greater_or_equal(year_min)
            elif year_max is not None:
                filters = Filter.by_property("year").less_or_equal(year_max)
            else:
                filters = None
            response = movies.query.hybrid(
                query=genre,
                target_vector="genres",
                filters=filters,
                limit=PAGE_SIZE,
            )
            # Re-rank the semantically relevant results by popularity
            sorted_movies = sorted(
                [o.properties for o in response.objects],
                key=lambda x: x["popularity"],
                reverse=True,
            )
            return ExplorerResponse(
                movies=sorted_movies,
                genre=genre,
                year_min=year_min,
                year_max=year_max,
            )
    except Exception as e:
        raise HTTPException(status_code=500, detail=f"Internal server error: {str(e)}")
Key Concepts Explained
Targeted vectors: Using target_vector="genres" searches genre-specific embeddings. This vector was created specifically from the genre tags, so the vector search evaluates similarity between the input genre and the stored genre embeddings.
Filters: The year filtering logic is identical to the search endpoint. Hopefully you've internalized this pattern now!
Post-processing: After getting vector search results, we sort by popularity to surface the most well-known movies in each genre. This means that the vector search results are not used to determine rankings; instead, vector search works as a filter of sorts, and the object data (e.g., popularity) is used for final ranking.
This will produce results that are semantically relevant to the genre but ranked by popularity within that relevance.
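This retrieve-then-rerank pattern can be illustrated on its own, without Weaviate. Here is a minimal sketch in which the sample movies and popularity scores are made up for illustration:

```python
# Hypothetical objects returned by a vector search for a genre.
# Vector search decides WHICH movies appear; popularity decides the ORDER.
retrieved = [
    {"title": "Cult Classic", "popularity": 12.3},
    {"title": "Blockbuster", "popularity": 98.7},
    {"title": "Sleeper Hit", "popularity": 45.1},
]

# Same post-processing as the endpoint: sort the already-relevant
# results by a stored property rather than by vector distance.
sorted_movies = sorted(retrieved, key=lambda m: m["popularity"], reverse=True)

print([m["title"] for m in sorted_movies])
# → ['Blockbuster', 'Sleeper Hit', 'Cult Classic']
```

Swapping the `key` function here is all it takes to rank by a different property (e.g., rating) without changing the retrieval step.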
Endpoint 5: /recommend - AI-Powered Recommendations
Example Solution
@app.get("/recommend", response_model=RecommendationResponse)
def recommend_movie(
    occasion: str = Query(..., description="Viewing occasion (e.g., 'date night', 'family movie')")
):
    """
    Get movie recommendations based on viewing occasion
    - Generates a query string from occasion
    - Performs semantic search against movie descriptions
    - Returns best match with reasoning
    """
    try:
        query_string = movie_occasion_to_query(occasion=occasion)
        full_task_prompt = f"""
        The user is interested in movie recommendations for this occasion:
        ========== OCCASION INPUT FROM USER ==========
        {occasion}
        ========== END INPUT ==========
        Out of these movies, recommend 2-4 suitable movies, and describe why, so the user can choose for themselves.
        IMPORTANT: Only include the recommendation text in your response and nothing else.
        """
        with connect_to_weaviate() as client:
            movies = client.collections.use(CollectionName.MOVIES)
            response = movies.generate.near_text(
                query=query_string,
                target_vector="default",
                limit=PAGE_SIZE,
                grouped_task=full_task_prompt,
                generative_provider=GenerativeConfig.anthropic(
                    model="claude-3-5-haiku-latest"
                ),
            )
            return RecommendationResponse(
                recommendation=response.generative.text,
                query_string=query_string,
                movies_considered=[o.properties for o in response.objects],
                occasion=occasion,
            )
    except Exception as e:
        raise HTTPException(status_code=500, detail=f"Internal server error: {str(e)}")
Key Concepts Explained
RAG (Retrieval Augmented Generation): This endpoint demonstrates the complete RAG pattern:
- Retrieval: Find movies semantically related to the occasion using near_text
- Augmentation: Provide those movies as context to the AI model
- Generation: Let the AI analyze and recommend specific movies with explanations.
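The three stages can be sketched with stand-in functions; here retrieve_movies and generate_text are hypothetical placeholders for the Weaviate near_text query and the generative model call, and the catalog data is invented for illustration:

```python
def retrieve_movies(query: str, limit: int = 3) -> list[dict]:
    # Stand-in for the vector search: returns semantically similar movies.
    catalog = [
        {"title": "Before Sunrise", "overview": "Two strangers share a romantic night."},
        {"title": "Toy Story", "overview": "Toys come to life in a family adventure."},
    ]
    return catalog[:limit]  # a real search would rank by vector similarity

def generate_text(prompt: str, context: list[dict]) -> str:
    # Stand-in for the LLM call that receives the retrieved movies
    # as context (Weaviate's grouped_task does this for you).
    titles = ", ".join(m["title"] for m in context)
    return f"Considering {titles}: ..."

movies = retrieve_movies("romantic intimate couple evening")  # Retrieval
prompt = "Recommend 2-4 suitable movies for: date night"      # task instruction
recommendation = generate_text(prompt, context=movies)        # Augmentation + Generation
```

The key point is the data flow: the retrieval output becomes part of the generation input, so the model reasons only over movies your search actually found.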
Occasion-to-query conversion: The movie_occasion_to_query() helper converts natural language occasions into search-optimized queries. For example:
- "date night" → "romantic intimate couple evening"
- "family movie" → "family friendly children entertainment"
Prompt engineering: The structured prompt ensures the AI focuses on the specific movies found and provides reasoning, rather than general recommendations.
Semantic search: Using target_vector="default" leverages the general content vectors (title + overview) to find movies that match the conceptual meaning of the occasion.
Generative configuration: Allows us to select a specific AI model at query time, independently of the collection configuration. This can easily be changed at the application level without modifying the underlying database or data model.
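For example, swapping providers is a one-line change at query time, assuming the corresponding API key is configured; the OpenAI model name below is illustrative:

```python
from weaviate.classes.generate import GenerativeConfig

# Same query, different model - no change to the collection schema needed.
generative_provider = GenerativeConfig.openai(model="gpt-4o-mini")
# ...or keep the original provider:
# generative_provider = GenerativeConfig.anthropic(model="claude-3-5-haiku-latest")
```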
Congratulations on implementing all the endpoints! Let's reflect on what you've accomplished and how these patterns apply to real-world AI applications.