
lemcs — Cursor Rules

by medelman17

AI Summary

FastAPI development guidelines for building legal document processing endpoints with async patterns and modular architecture. Developers working on document-intensive APIs in Cursor will benefit from these standardized patterns and endpoint structures.

Install

Copy this and paste it into Claude Code, Cursor, or any AI assistant:

```
I want to add the "lemcs — Cursor Rules" prompt rules to my project.
Repository: https://github.com/medelman17/lemcs

Please read the repo to find the rules/prompt file, then:
1. Download it to the correct location (.cursorrules, .windsurfrules, .github/prompts/, or project root — based on the file type)
2. If there's an existing rules file, merge the new rules in rather than overwriting
3. Confirm what was added
```
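Step 2 (merge rather than overwrite) can also be done by hand in the shell. A minimal sketch, assuming the rules land in `.cursorrules`; substitute whichever path your assistant expects:

```shell
# Manual sketch of step 2: merge into an existing rules file rather
# than overwriting it. The .cursorrules filename is an assumption;
# adjust for .windsurfrules, .github/prompts/, etc.
rules_file=".cursorrules"
tmpdir=$(mktemp -d) && cd "$tmpdir"
printf '# existing project rules\n' > "$rules_file"
# Append the downloaded lemcs rules instead of clobbering the file
printf '\n# lemcs — Cursor Rules\n' >> "$rules_file"
lines=$(grep -c '^#' "$rules_file")
echo "$lines"
```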

Description

FastAPI development guidelines for legal document processing endpoints

Example usage in endpoint

```python
@router.get("/nlp-status")
async def get_nlp_status(
    nlp_service: HybridLegalNLP = Depends(get_nlp_service)
) -> Dict[str, Any]:
    """Get status of NLP services."""
    status = nlp_service.get_service_status()
    return {
        "nlp_status": status,
        "model_info": nlp_service.bert_service.get_model_info()
    }
```

API Architecture

The LeMCS API provides endpoints for legal document processing, analysis, and consolidation using FastAPI with async patterns.
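In practice the async pattern means keeping blocking NLP work off the event loop. A minimal stdlib-only sketch, where `heavy_analysis` is an illustrative stand-in for a real NLP call, not part of LeMCS:

```python
import asyncio

def heavy_analysis(text: str) -> dict:
    # Stand-in for a blocking, CPU-bound NLP call (e.g. model inference)
    return {"length": len(text)}

async def analyze(text: str) -> dict:
    loop = asyncio.get_event_loop()
    # Offload the blocking call to the default thread pool so the
    # event loop stays free to serve other requests concurrently
    return await loop.run_in_executor(None, heavy_analysis, text)

result = asyncio.run(analyze("hello"))
```

This is the same `run_in_executor` technique the document processing endpoint below uses for its comprehensive analysis call.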

Core API Structure

```
api/
├── routes/
│   ├── documents.py          # Document upload and processing
│   ├── citations.py          # Citation extraction endpoints
│   ├── consolidation.py      # Document consolidation
│   ├── health.py             # Health check endpoints
│   └── documents_simple.py   # Simplified document endpoints
```

Document Processing Endpoint

```python
from fastapi import APIRouter, UploadFile, File, Depends, HTTPException
from nlp.hybrid_legal_nlp import HybridLegalNLP
from typing import Dict, Any
import asyncio

router = APIRouter(prefix="/documents", tags=["documents"])

@router.post("/analyze")
async def analyze_legal_document(
    document: UploadFile = File(...),
    # get_hybrid_nlp is the project's dependency provider for HybridLegalNLP
    nlp_service: HybridLegalNLP = Depends(get_hybrid_nlp)
) -> Dict[str, Any]:
    """
    Analyze uploaded legal document with comprehensive NLP processing.

    Returns:
        - Document type classification
        - Entity extraction results
        - Legal concept analysis
        - Confidence scores
    """
    try:
        # Validate file type
        if not document.content_type.startswith('text/'):
            raise HTTPException(
                status_code=400,
                detail="Only text files are supported"
            )

        # Extract text content
        content = await document.read()
        text = content.decode('utf-8')

        # Process with legal NLP in a worker thread to keep the event loop free
        loop = asyncio.get_event_loop()
        analysis = await loop.run_in_executor(
            None, nlp_service.comprehensive_analysis, text
        )

        return {
            "status": "success",
            "filename": document.filename,
            "document_type": analysis["document_analysis"]["type"],
            "confidence": analysis["document_analysis"]["confidence"],
            "analysis": analysis
        }
    except UnicodeDecodeError:
        raise HTTPException(
            status_code=400,
            detail="File encoding not supported. Please use UTF-8."
        )
    except Exception as e:
        raise HTTPException(
            status_code=500,
            detail=f"Document analysis failed: {str(e)}"
        )
```
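The validation and decoding steps in the endpoint can be factored out and exercised independently of FastAPI. A sketch; the function name and `ValueError` mapping are illustrative, not part of the project:

```python
def validate_and_decode(content_type: str, raw: bytes) -> str:
    """Mirror the endpoint's checks: text files only, UTF-8 only."""
    if not content_type.startswith("text/"):
        raise ValueError("Only text files are supported")
    try:
        return raw.decode("utf-8")
    except UnicodeDecodeError:
        # The endpoint maps this case to an HTTP 400 response
        raise ValueError("File encoding not supported. Please use UTF-8.")
```

Keeping this logic in a plain function makes it straightforward to unit-test the 400-level error paths without spinning up the app.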


Health Signals

- Maintenance: last commit 10 months ago (stale)
- Adoption: 0 ★, under 100 stars (niche)
- Docs: missing or thin (undocumented)

GitHub Signals

- Open issues: 0
- Last updated: 10 months ago
- License: none


Works With

Any AI assistant that accepts custom rules or system prompts, including Claude, ChatGPT, Cursor, Windsurf, Copilot, and more.