
Implement XOR Problem using Multi-Layered Perceptron

XOR Problem: Below is a clear, step-by-step implementation of the XOR problem using a Multi-Layer Perceptron (MLP). I will first explain the theory, then show the mathematical steps, and finally provide a simple implementation.

Step 1: Understand the XOR Problem

The XOR (Exclusive OR) function outputs:

    x₁  x₂  XOR
    0   0   0
    0   1   1
    1   0   1
    1   1   0

Key observation: XOR is not linearly separable, so it cannot be solved by a single-layer perceptron. Hence, we need a Multi-Layer Perceptron with at least one hidden layer.

Step 2: Network Architecture

We choose a 2–2–1 MLP architecture:

- Input layer: 2 neurons (x₁, x₂)
- Hidden layer: 2 neurons
- Output layer: 1 neuron

Activation functions:

- Hidden layer → sigmoid
- Output layer → sigmoid

Step 3: Initialize Parameters

Let: Weights...
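The 2–2–1 network described above can be sketched in NumPy. This is a minimal, illustrative version, not the post's full implementation: it assumes sigmoid activations in both layers, mean-squared-error loss, and an arbitrary random seed for the (truncated) initialization step. With only two hidden units, XOR training is sensitive to initialization and may occasionally stall in a local minimum.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# XOR inputs and targets
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# Step 3 (assumed): small random weights, zero biases
rng = np.random.default_rng(0)
W1 = rng.normal(size=(2, 2))   # input -> hidden
b1 = np.zeros((1, 2))
W2 = rng.normal(size=(2, 1))   # hidden -> output
b2 = np.zeros((1, 1))
eta = 0.5                      # learning rate
losses = []

for epoch in range(10000):
    # Forward pass
    h = sigmoid(X @ W1 + b1)       # hidden activations
    out = sigmoid(h @ W2 + b2)     # output activation
    losses.append(float(np.mean((out - y) ** 2)))

    # Backward pass (MSE loss; sigmoid derivative is a * (1 - a))
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)

    # Gradient descent updates
    W2 -= eta * h.T @ d_out
    b2 -= eta * d_out.sum(axis=0, keepdims=True)
    W1 -= eta * X.T @ d_h
    b1 -= eta * d_h.sum(axis=0, keepdims=True)

pred = (sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2) >= 0.5).astype(int)
print("predictions:", pred.ravel())
```

After training, the predictions should approach the XOR truth table (0, 1, 1, 0), with the loss dropping steadily from its initial value.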

Implementing Single-Layer Perceptron for Binary Classification

Mathematical Formulation

For input vector x, the perceptron computes:

- Linear combination: z = w·x + b
- Activation: a = σ(z), where σ is the sigmoid function
- Prediction: ŷ = 1 if a ≥ 0.5, else 0
- Loss: binary cross-entropy

The network learns by minimizing the loss through gradient descent, updating the parameters as:

    w = w - η * ∂L/∂w
    b = b - η * ∂L/∂b

This implementation provides a complete, working single-layer neural network for binary classification that can learn linear decision boundaries.

How to use

    # Create and train perceptron
    perceptron = SingleLayerPerceptron(input_size=2, learning_rate=0.1, epochs=500)
    perceptron.fit(X_train, y_train)

    # Make predictions
    predictions = perceptron.predict(X_test)
    probabilities = perceptron.predict_proba(X_test)

Implementation

    import numpy as np
    import matplotlib.pyplot as plt
    from sklearn.datasets import make_classification
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import ac...
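The implementation above is cut off. Based on the mathematical formulation and the usage snippet, a class with this interface might look roughly as follows; the body here is an assumption (a minimal sketch with zero-initialized weights and batch gradient descent), not the post's original code. The demo fits the AND function, which, unlike XOR, is linearly separable.

```python
import numpy as np

class SingleLayerPerceptron:
    """Sketch: single-layer perceptron with sigmoid activation,
    binary cross-entropy loss, and batch gradient descent."""

    def __init__(self, input_size, learning_rate=0.1, epochs=500):
        self.lr = learning_rate
        self.epochs = epochs
        self.w = np.zeros(input_size)   # assumed zero initialization
        self.b = 0.0

    @staticmethod
    def _sigmoid(z):
        return 1.0 / (1.0 + np.exp(-np.clip(z, -500, 500)))

    def predict_proba(self, X):
        # a = sigmoid(w . x + b)
        return self._sigmoid(X @ self.w + self.b)

    def predict(self, X):
        # yhat = 1 if a >= 0.5 else 0
        return (self.predict_proba(X) >= 0.5).astype(int)

    def fit(self, X, y):
        n = X.shape[0]
        for _ in range(self.epochs):
            a = self.predict_proba(X)
            # For BCE with sigmoid, dL/dz simplifies to (a - y)
            grad = a - y
            self.w -= self.lr * (X.T @ grad) / n
            self.b -= self.lr * grad.mean()
        return self

# Demo on the linearly separable AND function
X_and = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y_and = np.array([0, 0, 0, 1])
p = SingleLayerPerceptron(input_size=2, learning_rate=0.5, epochs=2000).fit(X_and, y_and)
print("AND predictions:", p.predict(X_and))
```

The simplification dL/dz = a − y is what makes the update rule above match the gradient-descent equations in the formulation: ∂L/∂w = xᵀ(a − y)/n and ∂L/∂b is the mean of (a − y).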