Blog
Articles about computational science and data science, neuroscience, and open source solutions. Personal stories are filed under Weekend Stories. Browse all topics here. All posts are CC BY-NC-SA licensed unless otherwise stated. Feel free to share, remix, and adapt the content as long as you give appropriate credit and distribute your contributions under the same license.
Hodgkin-Huxley model
An important step beyond simplified neuronal models is the Hodgkin-Huxley model. It is based on the experimental data of Hodgkin and Huxley, who received the Nobel Prize in Physiology or Medicine in 1963 for their groundbreaking work. The model describes the dynamics of a neuron's membrane potential by incorporating biophysical properties instead of phenomenological descriptions. It is a cornerstone of computational neuroscience and has been used to study the dynamics of action potentials in neurons and the behavior of neural networks. In this post, we derive the Hodgkin-Huxley model step by step and provide a simple Python implementation.
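As a taste of what such an implementation looks like, here is a minimal forward-Euler sketch of the membrane equation with the standard squid-axon parameter values (conductances, reversal potentials, and rate functions below are the commonly cited textbook values, assumed here, not taken from the post itself):

```python
import numpy as np

# Minimal Hodgkin-Huxley simulation (forward Euler), using the commonly
# cited squid-axon parameter values (assumed here as an illustration):
C = 1.0                                 # membrane capacitance (uF/cm^2)
g_Na, g_K, g_L = 120.0, 36.0, 0.3      # maximal conductances (mS/cm^2)
E_Na, E_K, E_L = 50.0, -77.0, -54.387  # reversal potentials (mV)

# Voltage-dependent opening/closing rates of the gating variables n, m, h
def alpha_n(V): return 0.01 * (V + 55) / (1 - np.exp(-(V + 55) / 10))
def beta_n(V):  return 0.125 * np.exp(-(V + 65) / 80)
def alpha_m(V): return 0.1 * (V + 40) / (1 - np.exp(-(V + 40) / 10))
def beta_m(V):  return 4.0 * np.exp(-(V + 65) / 18)
def alpha_h(V): return 0.07 * np.exp(-(V + 65) / 20)
def beta_h(V):  return 1.0 / (1 + np.exp(-(V + 35) / 10))

def simulate(I_ext=10.0, T=50.0, dt=0.01):
    """Integrate the HH equations for T ms under a constant current I_ext."""
    V, n, m, h = -65.0, 0.317, 0.053, 0.596  # approximate resting state
    Vs = []
    for _ in range(int(T / dt)):
        I_Na = g_Na * m**3 * h * (V - E_Na)  # sodium current
        I_K = g_K * n**4 * (V - E_K)         # potassium current
        I_L = g_L * (V - E_L)                # leak current
        V += dt * (I_ext - I_Na - I_K - I_L) / C
        n += dt * (alpha_n(V) * (1 - n) - beta_n(V) * n)
        m += dt * (alpha_m(V) * (1 - m) - beta_m(V) * m)
        h += dt * (alpha_h(V) * (1 - h) - beta_h(V) * h)
        Vs.append(V)
    return np.array(Vs)

trace = simulate()
print(trace.max())  # spikes overshoot 0 mV for a supra-threshold stimulus
```

With a stimulus of 10 µA/cm² the model fires repetitively; the full derivation of each term is what the post walks through.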
FitzHugh-Nagumo model
In the previous post, we analyzed the dynamics of the Van der Pol oscillator using phase plane analysis. In this post, we will see that this oscillator can be considered a special case of another dynamical system, the FitzHugh-Nagumo model. The FitzHugh-Nagumo model is a simplified model used to describe the dynamics of the action potential in neurons. With a few modifications of the Van der Pol equations, we can obtain the model’s ODE system. By again using phase plane analysis, we can then investigate how the dynamics of the system change under these modifications.
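The resulting ODE system is small enough to simulate in a few lines. The sketch below uses common textbook parameter values (a, b, tau, and the stimulus I are illustrative assumptions, not values from the post):

```python
import numpy as np

# FitzHugh-Nagumo equations (parameter values are common textbook choices):
#   dv/dt = v - v^3/3 - w + I
#   dw/dt = (v + a - b*w) / tau
a, b, tau, I = 0.7, 0.8, 12.5, 0.5

def simulate(T=200.0, dt=0.01, v0=-1.0, w0=-0.5):
    v, w = v0, w0
    vs = []
    for _ in range(int(T / dt)):
        v += dt * (v - v**3 / 3 - w + I)    # fast "voltage" variable
        w += dt * (v + a - b * w) / tau     # slow recovery variable
        vs.append(v)
    return np.array(vs)

vs = simulate()
# For this stimulus the fixed point is unstable and v oscillates ("spikes")
print(vs.max(), vs.min())
```

For this choice of I, the system sits in its oscillatory regime and v traces out relaxation oscillations, the simplified analogue of repetitive spiking.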
Van der Pol oscillator
In this post, we will apply phase plane analysis to the Van der Pol oscillator. The Van der Pol oscillator is a non-conservative oscillator with nonlinear damping, first described by the Dutch electrical engineer Balthasar van der Pol in 1920. We will explore how phase plane analysis can be used to gain insights into the behavior of this system and to predict its long-term behavior.
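The key qualitative prediction of the analysis is a limit cycle: trajectories from (almost) any initial condition settle onto the same closed orbit. A minimal sketch, rewriting x'' - mu*(1 - x^2)*x' + x = 0 as a first-order system (mu = 1 is an assumed example value):

```python
import numpy as np

# Van der Pol equation as a first-order system (y = dx/dt); mu = 1 is an
# illustrative choice of the damping parameter:
mu = 1.0

def simulate(x0=0.1, y0=0.0, T=100.0, dt=0.001):
    x, y = x0, y0
    xs = []
    for _ in range(int(T / dt)):
        x, y = x + dt * y, y + dt * (mu * (1 - x**2) * y - x)
        xs.append(x)
    return np.array(xs)

xs = simulate()
# After the transient, the trajectory sits on a limit cycle whose
# amplitude is close to 2, regardless of the (nonzero) initial condition
print(xs[-20000:].max())
```

Plotting x against y over time would show exactly the spiral-onto-a-cycle picture that the phase portrait predicts.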
Nullclines and fixed points of the Rössler attractor
After introducing phase plane analysis in the previous post, we will now apply this method to the Rössler attractor presented earlier. We will investigate the system’s nullclines and fixed points, and analyze the attractor’s dynamics in the phase space.
Using phase plane analysis to understand dynamical systems
When it comes to understanding the behavior of dynamical systems, analyzing them directly from their differential equations can quickly become too complex. In such cases, phase plane analysis can be a powerful tool to gain insights into the system’s behavior. This method allows us to visualize the system’s dynamics in phase portraits, providing a clear and intuitive representation of its behavior. Here, we explore how we can use this method and apply it to the simple pendulum as an example.
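The two basic ingredients of the method are the vector field over the phase plane and the fixed points where it vanishes. For the pendulum, both are easy to compute (g/L = 1 is an assumed normalization):

```python
import numpy as np

# Simple pendulum, theta'' = -(g/L) * sin(theta), as a first-order system
# (g/L normalized to 1):  theta' = omega,  omega' = -sin(theta)
def f(theta, omega):
    return omega, -np.sin(theta)

# Vector field on a grid -- this is what a phase portrait visualizes
theta, omega = np.meshgrid(np.linspace(-2 * np.pi, 2 * np.pi, 21),
                           np.linspace(-3, 3, 21))
dtheta, domega = f(theta, omega)

# Fixed points lie where both derivatives vanish: omega = 0 and
# sin(theta) = 0, i.e. (k*pi, 0) -- the hanging (even k) and
# inverted (odd k) pendulum positions
for k in range(-1, 2):
    print(k * np.pi, f(k * np.pi, 0.0))
```

Feeding `dtheta` and `domega` into a quiver or streamline plot produces the familiar phase portrait, with closed orbits around the hanging position and saddles at the inverted one.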
PyTorch on Apple Silicon
Some time ago, PyTorch became fully available for Apple Silicon. It’s no longer necessary to install the nightly builds to run PyTorch on the GPU of your Apple Silicon machine, as I described in one of my earlier posts.
Rössler attractor
Unlike the Lorenz attractor which emerges from the dynamics of convection rolls, the Rössler attractor does not describe a physical system found in nature. Instead, it is a mathematical construction designed to illustrate and study the behavior of chaotic systems in a simpler, more accessible manner. In this post, we explore how we can quickly simulate this strange attractor using simple Python code.
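A quick simulation really does fit in a handful of lines. The sketch below uses the parameter values a = b = 0.2, c = 5.7 from Rössler's 1976 paper and a simple forward-Euler integrator (an assumed, illustrative setup rather than the post's exact code):

```python
import numpy as np

# Rössler system with the classic parameter values a = b = 0.2, c = 5.7:
#   dx/dt = -y - z,  dy/dt = x + a*y,  dz/dt = b + z*(x - c)
a, b, c = 0.2, 0.2, 5.7

def simulate(T=500.0, dt=0.01, state=(1.0, 1.0, 1.0)):
    x, y, z = state
    xs = []
    for _ in range(int(T / dt)):
        dx = -y - z
        dy = x + a * y
        dz = b + z * (x - c)
        x, y, z = x + dt * dx, y + dt * dy, z + dt * dz
        xs.append(x)
    return np.array(xs)

xs = simulate()
# Chaotic but bounded: the trajectory keeps spiraling on the attractor
print(xs.min(), xs.max())
```

Storing y and z as well and plotting the three coordinates in 3D reveals the attractor's characteristic single-lobed spiral with occasional excursions in z.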
Understanding Hebbian learning in Hopfield networks
Hopfield networks, a form of recurrent neural network (RNN), serve as a fundamental model for understanding associative memory and pattern recognition in computational neuroscience. Central to the operation of Hopfield networks is the Hebbian learning rule, an idea encapsulated by the maxim ‘neurons that fire together, wire together’. In this post, we explore the mathematical underpinnings of Hebbian learning within Hopfield networks, emphasizing its role in pattern recognition.
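The Hebbian rule itself is remarkably compact: each weight is the correlation of the two neurons' activities across the stored patterns. A minimal sketch (the pattern below is an arbitrary example):

```python
import numpy as np

def hebbian_weights(patterns):
    """Hebbian storage: W = (1/N) * sum over patterns of outer(p, p)."""
    n = patterns.shape[1]
    W = patterns.T @ patterns / n   # "fire together, wire together"
    np.fill_diagonal(W, 0)          # no self-connections
    return W

def recall(W, state, steps=5):
    """Synchronous updates pull the state toward a stored pattern."""
    for _ in range(steps):
        state = np.sign(W @ state)
    return state

# Store one binary (+1/-1) pattern, then recover it from a corrupted copy
pattern = np.array([1, 1, -1, -1, 1, -1, 1, 1, -1, 1, -1, -1], dtype=float)
W = hebbian_weights(pattern[None, :])

noisy = pattern.copy()
noisy[:3] *= -1                     # flip three bits
print(np.array_equal(recall(W, noisy), pattern))  # pattern is recovered
```

This is pattern completion in miniature: the corrupted input falls into the basin of attraction of the stored pattern and the dynamics restore it exactly.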
Building a neural network from scratch using NumPy
Ever thought about building your own neural network from scratch using nothing but NumPy? In this post, we will do exactly that: build a simple feedforward neural network and train it on the MNIST dataset.
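To give a flavor of the forward/backward mechanics involved, here is a toy stand-in: a tiny network trained with backpropagation on XOR instead of MNIST (architecture, learning rate, and iteration count are illustrative assumptions):

```python
import numpy as np

# Tiny feedforward net trained on XOR with plain backprop -- a toy
# stand-in for the MNIST setup; all hyperparameters are illustrative.
rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(0, 1, (2, 8)); b1 = np.zeros(8)   # input -> hidden
W2 = rng.normal(0, 1, (8, 1)); b2 = np.zeros(1)   # hidden -> output

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

losses = []
for _ in range(5000):
    # forward pass
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    losses.append(np.mean((out - y) ** 2))
    # backward pass: chain rule for mean squared error + sigmoid
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= 1.0 * h.T @ d_out; b2 -= 1.0 * d_out.sum(axis=0)
    W1 -= 1.0 * X.T @ d_h;   b1 -= 1.0 * d_h.sum(axis=0)

print(losses[0], losses[-1])  # loss drops as the network trains
```

The MNIST version in the post follows the same pattern, just with larger weight matrices, a multi-class output, and mini-batches instead of the full dataset.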
Python’s version logos
Have you ever noticed that Python has introduced individual version logos starting with version 3.10? I couldn’t find any official announcement, but luckily, the Python community on Mastodon was able to help out.