Implementing Single-Layer Perceptron for Binary Classification
Mathematical Formulation

For an input vector x, the perceptron computes:

- Linear combination: z = w·x + b
- Activation: a = σ(z), where σ is the sigmoid function
- Prediction: ŷ = 1 if a ≥ 0.5, else 0
- Loss: binary cross-entropy

The network learns by minimizing the loss through gradient descent, updating the weights and bias as:

w = w - η * ∂L/∂w
b = b - η * ∂L/∂b

For a sigmoid activation paired with binary cross-entropy, these gradients take a simple form: ∂L/∂w = (a - y)·x and ∂L/∂b = (a - y), averaged over the training examples.

This implementation provides a complete, working single-layer neural network for binary classification that can learn linear decision boundaries.

How to use

```python
# Create and train the perceptron
perceptron = SingleLayerPerceptron(input_size=2, learning_rate=0.1, epochs=500)
perceptron.fit(X_train, y_train)

# Make predictions
predictions = perceptron.predict(X_test)
probabilities = perceptron.predict_proba(X_test)
```

Implementation

```python
import numpy as np
import matplotlib.pyplot as plt
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score
```
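As a reference, a minimal sketch of a SingleLayerPerceptron class that follows the formulation above (sigmoid activation, binary cross-entropy loss, full-batch gradient descent) and matches the interface in the usage example might look like this; the internals here, such as the small random weight initialization and the full-batch update, are illustrative assumptions rather than the exact original code.

```python
import numpy as np

class SingleLayerPerceptron:
    """Single sigmoid unit trained with batch gradient descent on binary cross-entropy."""

    def __init__(self, input_size, learning_rate=0.1, epochs=500):
        self.learning_rate = learning_rate
        self.epochs = epochs
        # Illustrative choice (assumption): small random weights, zero bias
        self.w = np.random.randn(input_size) * 0.01
        self.b = 0.0

    @staticmethod
    def _sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    def predict_proba(self, X):
        # a = sigma(w . x + b) for every row of X
        return self._sigmoid(np.asarray(X, dtype=float) @ self.w + self.b)

    def predict(self, X):
        # Threshold the predicted probability at 0.5
        return (self.predict_proba(X) >= 0.5).astype(int)

    def fit(self, X, y):
        X = np.asarray(X, dtype=float)
        y = np.asarray(y, dtype=float)
        n_samples = X.shape[0]
        for _ in range(self.epochs):
            a = self._sigmoid(X @ self.w + self.b)
            # With sigmoid + binary cross-entropy, dL/dz = a - y, so
            # dL/dw = X^T (a - y) / n and dL/db = mean(a - y)
            error = a - y
            grad_w = X.T @ error / n_samples
            grad_b = error.mean()
            self.w -= self.learning_rate * grad_w
            self.b -= self.learning_rate * grad_b
        return self
```

Updating on the full batch every epoch keeps the code close to the update rule above; mini-batch or per-sample (stochastic) updates would work just as well for a problem of this size.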