📊 Machine Learning
Traditional ML algorithms, supervised and unsupervised learning techniques
🎯 CrossEntropy Loss
Fundamental loss function for classification problems. Covers the mathematical foundation, the derivation from maximum likelihood estimation (MLE), and practical implementations.
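For a quick feel of the idea (a minimal Python sketch, not the article's own implementation): the loss is the negative log of the probability the model assigns to the true class.

```python
import math

def cross_entropy(probs, true_class):
    """Cross-entropy loss for one sample: -log of the probability
    assigned to the true class (probs should sum to 1)."""
    eps = 1e-12  # guard against log(0)
    return -math.log(probs[true_class] + eps)

# A confident correct prediction is penalized lightly,
# a confident wrong one heavily.
loss_good = cross_entropy([0.9, 0.1], 0)
loss_bad = cross_entropy([0.1, 0.9], 0)
```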
Status: Available

📊 MSE Loss
Mean Squared Error loss function for regression problems. Mathematical foundation, properties, and comparison with other losses.
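A minimal sketch of the computation (illustrative only): the loss is the average squared residual, which penalizes large errors disproportionately.

```python
def mse(y_true, y_pred):
    """Mean Squared Error: the average squared residual. Squaring keeps
    the loss differentiable and penalizes large errors more heavily."""
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

loss = mse([1.0, 2.0, 3.0], [1.5, 2.0, 2.0])  # (0.25 + 0.0 + 1.0) / 3
```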
Status: Available

📈 Linear Regression
Foundation of machine learning: modeling linear relationships with the least squares method and gradient descent optimization.
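The least-squares-plus-gradient-descent recipe fits in a few lines. The `fit_line` helper below is an illustrative sketch, with the learning rate and step count chosen for this toy data:

```python
def fit_line(xs, ys, lr=0.01, steps=5000):
    """Fit y = w*x + b by gradient descent on the MSE loss."""
    w, b, n = 0.0, 0.0, len(xs)
    for _ in range(steps):
        # Partial derivatives of the mean squared error w.r.t. w and b
        grad_w = sum(2 * (w * x + b - y) * x for x, y in zip(xs, ys)) / n
        grad_b = sum(2 * (w * x + b - y) for x, y in zip(xs, ys)) / n
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

w, b = fit_line([0.0, 1.0, 2.0, 3.0], [1.0, 3.0, 5.0, 7.0])  # data from y = 2x + 1
```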
Status: Available

📊 Logistic Regression
Binary classification using sigmoid function. MLE derivation, cross-entropy loss, and decision boundaries.
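A sketch of the sigmoid decision rule; the weights `w=1.5, b=-3.0` below are made up purely for illustration:

```python
import math

def sigmoid(z):
    """Squash a real-valued score into a probability in (0, 1)."""
    return 1.0 / (1.0 + math.exp(-z))

def predict(x, w, b, threshold=0.5):
    """Class 1 when P(y=1|x) = sigmoid(w*x + b) crosses the threshold;
    the decision boundary is where w*x + b = 0."""
    return int(sigmoid(w * x + b) >= threshold)

# With these made-up weights the decision boundary sits at x = 2
label_hi = predict(3.0, w=1.5, b=-3.0)
label_lo = predict(1.0, w=1.5, b=-3.0)
```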
Status: Available

🎯 Softmax Regression
Multiclass classification using softmax function. Extension of logistic regression to multiple classes.
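The softmax transform itself is short; this sketch uses the standard max-subtraction trick for numerical stability:

```python
import math

def softmax(logits):
    """Turn raw class scores into a probability distribution.
    Subtracting the max leaves the result unchanged but avoids overflow."""
    m = max(logits)
    exps = [math.exp(z - m) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

probs = softmax([2.0, 1.0, 0.1])
```

With two classes, softmax reduces to the sigmoid of the logit difference, which is the sense in which it extends logistic regression.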
Status: Available

🚀 Optimizer Visualization
Interactive visualization of neural network optimization algorithms including SGD, Momentum, AdaGrad, Adam, RMSprop, and AdaDelta.
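As one example of the algorithms visualized there, here is a minimal sketch of SGD with momentum on a 1-D quadratic (hyperparameters chosen for illustration):

```python
def sgd_momentum(grad, x0, lr=0.1, beta=0.9, steps=300):
    """SGD with momentum: a velocity term accumulates past gradients,
    damping oscillation and accelerating progress along flat directions."""
    x, v = x0, 0.0
    for _ in range(steps):
        v = beta * v + grad(x)  # exponential moving sum of gradients
        x -= lr * v
    return x

# Minimize f(x) = (x - 3)^2, whose gradient is 2 * (x - 3)
x_min = sgd_momentum(lambda x: 2 * (x - 3), x0=0.0)
```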
Status: Available

🌳 Decision Trees
Tree-based algorithms for both classification and regression problems.
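A depth-1 tree (a decision stump) gives the flavor of how splits are chosen; `best_stump` is an illustrative sketch using misclassification count as the split criterion:

```python
def best_stump(xs, ys):
    """Depth-1 decision tree: scan thresholds on one feature and keep
    the split that misclassifies the fewest binary labels."""
    best = None  # (errors, threshold, label assigned to the left side)
    for t in sorted(set(xs)):
        for left_label in (0, 1):
            preds = [left_label if x < t else 1 - left_label for x in xs]
            errors = sum(p != y for p, y in zip(preds, ys))
            if best is None or errors < best[0]:
                best = (errors, t, left_label)
    return best

errors, threshold, left_label = best_stump(
    [1, 2, 3, 8, 9, 10],
    [0, 0, 0, 1, 1, 1],
)
```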
Status: Coming Soon

🎯 Clustering
Unsupervised learning techniques including K-means, hierarchical clustering, and DBSCAN.
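A minimal sketch of Lloyd's K-means algorithm on toy 2-D data (illustrative, not a reference implementation):

```python
import random

def kmeans(points, k, iters=20, seed=0):
    """Lloyd's algorithm: alternate between assigning each point to its
    nearest centroid and moving each centroid to its cluster's mean."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(
                range(k),
                key=lambda i: sum((a - b) ** 2 for a, b in zip(p, centroids[i])),
            )
            clusters[nearest].append(p)
        # Empty clusters keep their previous centroid
        centroids = [
            tuple(sum(vals) / len(cl) for vals in zip(*cl)) if cl else centroids[i]
            for i, cl in enumerate(clusters)
        ]
    return centroids

# Two well-separated 2-D blobs
data = [(0, 0), (0, 1), (1, 0), (10, 10), (10, 11), (11, 10)]
centers = sorted(kmeans(data, k=2))
```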
Status: Coming Soon

📊 Model Evaluation
Metrics, cross-validation, and techniques for evaluating machine learning models.
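Two of the basic building blocks, sketched minimally (accuracy as the metric, contiguous folds for simplicity; real splitters usually shuffle first):

```python
def accuracy(y_true, y_pred):
    """Fraction of predictions that match the labels."""
    return sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)

def k_fold_indices(n, k):
    """Split indices 0..n-1 into k contiguous, near-equal folds; each fold
    serves once as the validation set in k-fold cross-validation."""
    folds, start = [], 0
    for i in range(k):
        size = n // k + (1 if i < n % k else 0)
        folds.append(list(range(start, start + size)))
        start += size
    return folds

acc = accuracy([1, 0, 1, 1], [1, 0, 0, 1])  # 3 of 4 correct
folds = k_fold_indices(10, 3)  # fold sizes 4, 3, 3
```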
Status: Coming Soon

🔧 Feature Engineering
Techniques for creating, selecting, and transforming features for better model performance.
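One of the most common transformations, z-score standardization, as a minimal sketch:

```python
def standardize(column):
    """Z-score a feature column to zero mean and unit variance, so that
    features on different scales contribute comparably to a model."""
    n = len(column)
    mean = sum(column) / n
    std = (sum((x - mean) ** 2 for x in column) / n) ** 0.5
    std = std or 1.0  # constant column: avoid division by zero
    return [(x - mean) / std for x in column]

z = standardize([1.0, 2.0, 3.0, 4.0, 5.0])
```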
Status: Coming Soon