
evo — Cursor Rules

by VitalyOborin


Install

Copy this and paste it into Claude Code, Cursor, or any AI assistant:

I want to add the "evo — Cursor Rules" prompt rules to my project.
Repository: https://github.com/VitalyOborin/evo

Please read the repo to find the rules/prompt file, then:
1. Download it to the correct location (.cursorrules, .windsurfrules, .github/prompts/, or project root — based on the file type)
2. If there's an existing rules file, merge the new rules in rather than overwriting
3. Confirm what was added

Description

Dependency Inversion Principle (DIP)

📌 5. AI/ML Use Cases

• Swappable models: Trainer doesn’t care if it’s PyTorch, TensorFlow, or scikit-learn.
• Config-driven experiments: Swap models/optimizers/metrics from a YAML/JSON file.
• Testing: Replace heavy models with mock models for fast unit tests.

✅ Example with config-driven injection:

```yaml
model: "RandomForest"
metric: "Accuracy"
```

```python
def build_from_config(cfg):
    model = registry.get_model(cfg["model"])
    metric = registry.get_metric(cfg["metric"])
    return InferenceService(model, metric)
```

---
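The config-driven example above assumes a `registry` object. A minimal dict-based sketch of one possible shape (the `Registry` class and the stub `RandomForest` model here are illustrative assumptions, not part of the original rules):

```python
class Registry:
    """Minimal dict-based registry mapping config strings to factories (illustrative)."""

    def __init__(self):
        self._models = {}
        self._metrics = {}

    def register_model(self, name, factory):
        self._models[name] = factory

    def register_metric(self, name, factory):
        self._metrics[name] = factory

    def get_model(self, name):
        return self._models[name]()  # instantiate on lookup

    def get_metric(self, name):
        return self._metrics[name]()


class RandomForest:
    """Stub model so the lookup below has something to return."""

    def predict(self, X):
        return [0] * len(X)


registry = Registry()
registry.register_model("RandomForest", RandomForest)

model = registry.get_model("RandomForest")
```

With a registry like this in place, `build_from_config` only reads names from the config file and never imports a concrete model itself.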

Dependency Inversion Principle (DIP) in Python

Definition:

> High-level modules should not depend on low-level modules. Both should depend on abstractions.
> Abstractions should not depend on details. Details should depend on abstractions.

This prevents “hard-wiring” implementations into business logic.

---

📌 1. Classes

❌ Bad example (Trainer tightly coupled to LogisticRegression):

```python
class LogisticRegressionModel:
    def fit(self, X, y): ...
    def predict(self, X): ...

class Trainer:
    def __init__(self):
        self.model = LogisticRegressionModel()  # hard dependency

    def train(self, X, y):
        self.model.fit(X, y)
```

✅ Good example (Trainer depends on abstraction, not concrete class):

```python
from abc import ABC, abstractmethod

class Model(ABC):
    @abstractmethod
    def fit(self, X, y): ...

    @abstractmethod
    def predict(self, X): ...

class LogisticRegressionModel(Model):
    def fit(self, X, y): ...
    def predict(self, X): ...

class RandomForestModel(Model):
    def fit(self, X, y): ...
    def predict(self, X): ...

class Trainer:
    def __init__(self, model: Model):
        self.model = model  # injected at runtime

    def train(self, X, y):
        self.model.fit(X, y)
```

Now Trainer works with any Model implementation.

---
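One sketch of the testing payoff this buys: because Trainer only sees the Model abstraction, a lightweight fake can stand in for a heavy model in unit tests. The `MockModel` below is illustrative; `Model` and `Trainer` are repeated so the snippet is self-contained:

```python
from abc import ABC, abstractmethod

class Model(ABC):
    @abstractmethod
    def fit(self, X, y): ...

    @abstractmethod
    def predict(self, X): ...

class Trainer:
    def __init__(self, model: Model):
        self.model = model

    def train(self, X, y):
        self.model.fit(X, y)

class MockModel(Model):
    """Records calls instead of training; keeps unit tests fast."""

    def __init__(self):
        self.fitted_with = None

    def fit(self, X, y):
        self.fitted_with = (X, y)

    def predict(self, X):
        return [0] * len(X)

mock = MockModel()
Trainer(mock).train([[1], [2]], [0, 1])
```

The test can then assert on `mock.fitted_with` without ever fitting a real model.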

📌 2. Functions

❌ Bad example (direct dependency):

```python
def evaluate(model, X, y):
    from sklearn.metrics import accuracy_score
    return accuracy_score(y, model.predict(X))
```

✅ Good example (metric injected as abstraction):

```python
from abc import ABC, abstractmethod

class Metric(ABC):
    @abstractmethod
    def calculate(self, y_true, y_pred): ...

class Accuracy(Metric):
    def calculate(self, y_true, y_pred):
        return (y_true == y_pred).mean()

def evaluate(model, X, y, metric: Metric):
    return metric.calculate(y, model.predict(X))
```

---
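A quick usage sketch of the injected version. The `ConstantModel` stub is illustrative, and `Accuracy` is rewritten over plain lists (rather than the NumPy-style `(y_true == y_pred).mean()`) so the snippet has no dependencies:

```python
from abc import ABC, abstractmethod

class Metric(ABC):
    @abstractmethod
    def calculate(self, y_true, y_pred): ...

class Accuracy(Metric):
    def calculate(self, y_true, y_pred):
        # Plain-list equivalent of (y_true == y_pred).mean()
        return sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)

def evaluate(model, X, y, metric: Metric):
    return metric.calculate(y, model.predict(X))

class ConstantModel:
    """Stub model that always predicts 1."""

    def predict(self, X):
        return [1] * len(X)

# Predictions are [1, 1, 1] against labels [1, 1, 0]: 2 of 3 correct.
score = evaluate(ConstantModel(), [[0], [0], [0]], [1, 1, 0], Accuracy())
```

Swapping in a different Metric implementation (say, an F1 score) now requires no change to `evaluate` at all.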

Works With

Any AI assistant that accepts custom rules or system prompts

Claude
ChatGPT
Cursor
Windsurf
Copilot
+ more