Logistic Map Approximator (Neural Network)

This model approximates the logistic map equation:

xₙ₊₁ = r × xₙ × (1 − xₙ)

A simple feedforward neural network is trained to approximate this map and learn its dynamics, including chaotic behavior, across values of r ∈ [2.5, 4.0].
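
For reference, the target function is easy to evaluate directly; the network's job is to recover it from (x, r) pairs. Below is a minimal pure-Python sketch of the map itself (the starting value and r = 3.9 are just an illustrative chaotic-regime example, not taken from the training code):

```python
def logistic_map(x: float, r: float) -> float:
    """One step of the logistic map: x_{n+1} = r * x_n * (1 - x_n)."""
    return r * x * (1.0 - x)

# Iterate a short trajectory in the chaotic regime (r = 3.9).
x, r = 0.2, 3.9
trajectory = []
for _ in range(10):
    x = logistic_map(x, r)
    trajectory.append(x)
print(trajectory)
```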

Model Details

  • Framework: PyTorch
  • Inputs:
    • x ∈ [0, 1] (the current state xₙ)
    • r ∈ [2.5, 4.0] (the growth parameter)
  • Output: x_next, an approximation of xₙ₊₁
  • Loss Function: Mean Squared Error (MSE)
  • Architecture: 2 hidden layers with ReLU activations (see the sketch below)
  • Training: 100 epochs
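
A minimal PyTorch sketch of a model matching this description. Only the 2-hidden-layer ReLU design, MSE loss, and 100 epochs come from this card; the hidden width (64), optimizer (Adam), learning rate, and synthetic full-batch data are illustrative assumptions, not necessarily what mandelbrot.py does:

```python
import torch
import torch.nn as nn

class LogisticMapNet(nn.Module):
    """Feedforward approximator: (x, r) -> x_next."""
    def __init__(self, hidden: int = 64):  # hidden width is an assumption
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(2, hidden),   # inputs: (x, r)
            nn.ReLU(),
            nn.Linear(hidden, hidden),
            nn.ReLU(),
            nn.Linear(hidden, 1),   # output: x_next
        )

    def forward(self, xr: torch.Tensor) -> torch.Tensor:
        return self.net(xr)

# Synthetic training pairs: random x ∈ [0, 1] and r ∈ [2.5, 4.0].
x = torch.rand(10_000, 1)
r = 2.5 + 1.5 * torch.rand(10_000, 1)
X = torch.cat([x, r], dim=1)
Y = r * x * (1 - x)  # true logistic-map targets

model = LogisticMapNet()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)  # optimizer/lr assumed
loss_fn = nn.MSELoss()

for epoch in range(100):
    optimizer.zero_grad()
    loss = loss_fn(model(X), Y)
    loss.backward()
    optimizer.step()
```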

Performance

The model closely approximates x_next across the trained range of r, including the chaotic regime (roughly r > 3.57).

Files

  • logistic_map_approximator.pth: Trained PyTorch model weights (see the loading sketch after this list)
  • mandelbrot.py: Full training and evaluation code
  • README.md: You're reading it
  • example_plot.png: Comparison of true vs predicted outputs
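
A hedged usage sketch for the shipped weights. It assumes the checkpoint stores a state_dict compatible with the LogisticMapNet class sketched above; if mandelbrot.py saves the full module or uses different layer sizes, adjust accordingly:

```python
import torch

# Load the published weights (assumed to be a state_dict).
model = LogisticMapNet()
model.load_state_dict(torch.load("logistic_map_approximator.pth", map_location="cpu"))
model.eval()

# Predict one step in the chaotic regime and compare with the exact map.
with torch.no_grad():
    xr = torch.tensor([[0.2, 3.9]])  # x = 0.2, r = 3.9 (illustrative values)
    x_next_pred = model(xr).item()

x_next_true = 3.9 * 0.2 * (1 - 0.2)  # = 0.624
print(f"predicted: {x_next_pred:.4f}, true: {x_next_true:.4f}")
```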

Applications

  • Chaos theory visualizations
  • Educational tools on non-linear dynamics
  • Function approximation benchmarking

License

MIT License
