Project on Artificial Neural Network
A Simple Example of a Neural Network in Python
This code provides a simple introduction to creating and training a neural network for a binary classification problem. It demonstrates the basic structure of a neural network and how it can be used to solve a simple problem like XOR.
Before running this code, make sure you have TensorFlow installed. You can install it using pip install tensorflow.
# Step 1: Import the necessary libraries
import tensorflow as tf
import numpy as np
# Step 2: Create a simple dataset (XOR problem)
# XOR problem: We want the neural network to learn XOR logic
# Input data
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=np.float32)
# Output data
y = np.array([[0], [1], [1], [0]], dtype=np.float32)
# Step 3: Build the neural network model
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(2,)),              # Input layer with 2 features
    tf.keras.layers.Dense(4, activation='relu'),    # Hidden layer with 4 neurons and ReLU activation (only 2 neurons can get stuck on XOR)
    tf.keras.layers.Dense(1, activation='sigmoid')  # Output layer with 1 neuron and sigmoid activation
])
# Step 4: Compile the model
model.compile(optimizer='adam', loss='binary_crossentropy', metrics=['accuracy'])
# Step 5: Train the model
history = model.fit(X, y, epochs=1000, verbose=0)  # Train silently for 1000 epochs (full passes over the dataset)
# Step 6: Evaluate the model
loss, accuracy = model.evaluate(X, y)
print(f"Model loss: {loss:.4f}")
print(f"Model accuracy: {accuracy*100:.2f}%")
# Step 7: Make predictions
predictions = model.predict(X)
rounded_predictions = np.round(predictions).astype(int)  # Round probabilities to 0 or 1
print("Predictions:")
for i in range(len(X)):
    print(f"Input: {X[i]}, Predicted Output: {rounded_predictions[i][0]}")
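To make the steps above concrete, here is a minimal NumPy-only sketch of the forward pass the model performs. The weights below are hand-picked values that happen to solve XOR, shown purely for illustration; a trained model will arrive at different numbers, and the factor of 10 inside the sigmoid is just a steepness choice so the outputs round cleanly.

```python
import numpy as np

# Hand-picked (illustrative) weights that solve XOR -- NOT what training finds
W1 = np.array([[1.0, 1.0],
               [1.0, 1.0]])      # hidden-layer weights, shape (2, 2)
b1 = np.array([0.0, -1.0])      # hidden-layer biases
W2 = np.array([[1.0], [-2.0]])  # output-layer weights, shape (2, 1)
b2 = np.array([-0.5])           # output-layer bias

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=np.float32)

hidden = np.maximum(0, X @ W1 + b1)      # ReLU: max(0, x), applied elementwise
logits = hidden @ W2 + b2                # linear combination in the output layer
output = 1 / (1 + np.exp(-10 * logits))  # steep sigmoid squashes to near 0 or 1
print(np.round(output).ravel())          # [0. 1. 1. 0.]
```

This is exactly what the two Dense layers compute: a matrix multiply, a bias add, and an activation, repeated once per layer. Training only changes the numbers in W1, b1, W2, and b2.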
Explanation:
1. We import the necessary libraries, including TensorFlow and NumPy.
2. We create a simple dataset for the XOR problem. XOR is a binary logic gate whose output is 1 exactly when its two inputs differ (equivalently, when the number of 1s in the input is odd). We define the input data X and the corresponding output data y.
3. We build a neural network model using TensorFlow's Sequential API. This model consists of an input layer, a hidden layer with ReLU activation, and an output layer with sigmoid activation.
4. We compile the model, specifying the optimizer ('adam') and the loss function ('binary_crossentropy').
5. We train the model on the XOR dataset for 1000 epochs (full passes over the dataset) using the fit method.
6. We evaluate the model's performance using the evaluate method, calculating the loss and accuracy.
7. Finally, we make predictions using the trained model and round the predictions to 0 or 1. We display the input and predicted output for each example in the dataset.
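The binary_crossentropy loss chosen in step 4 can also be computed by hand, which helps demystify the number reported by evaluate. Below is a minimal standard-library sketch; the predicted probabilities are made-up example values, not actual model outputs.

```python
import math

def binary_crossentropy(y_true, y_pred, eps=1e-7):
    """Mean binary cross-entropy over a batch; eps guards against log(0)."""
    total = 0.0
    for t, p in zip(y_true, y_pred):
        p = min(max(p, eps), 1 - eps)  # clip probabilities away from 0 and 1
        total += -(t * math.log(p) + (1 - t) * math.log(1 - p))
    return total / len(y_true)

y_true = [0, 1, 1, 0]          # XOR targets
y_pred = [0.1, 0.9, 0.8, 0.2]  # hypothetical predicted probabilities
print(f"{binary_crossentropy(y_true, y_pred):.4f}")  # 0.1643
```

Each term penalizes the model according to how much probability it assigned to the wrong class, so the loss approaches 0 as the predictions approach the true labels, which is why the training loss shrinks over the 1000 epochs.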