A collaborative and supportive community at the University of Gloucestershire for students interested in machine learning and deep neural networks.
The University of Gloucestershire Machine Learning Society is a community dedicated to exploring the fascinating world of neural networks and deep learning. We provide a supportive environment where students can learn, collaborate, and discover their passion within this rapidly evolving field.
Our bi-weekly sessions cover topics ranging from the fundamentals of neural networks to advanced concepts in transformers, RNNs, CNNs, and practical skills like effective prompting. Led by our resident researcher specializing in mechanistic interpretability and data efficiency, our meetings balance theoretical knowledge with practical applications.
Whether you're just beginning your journey in machine learning or looking to deepen your expertise, our society offers resources, mentorship, and a network of like-minded individuals who share your curiosity and enthusiasm.
The fundamental building blocks of neural networks, perceptrons are inspired by biological neurons. They take multiple inputs, apply weights, and produce an output through an activation function. Understanding perceptrons is essential for grasping how neural networks learn patterns in data.
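The idea fits in a few lines. Here is a minimal pure-Python sketch (the weights and bias below are hand-picked for illustration, not learned):

```python
# Minimal perceptron sketch: weighted sum of inputs plus a bias,
# passed through a step activation (fires 1 if the sum is positive).
def perceptron(inputs, weights, bias):
    total = sum(w * x for w, x in zip(weights, inputs)) + bias
    return 1 if total > 0 else 0

# Example: hand-picked weights that make the perceptron compute logical AND.
def and_gate(a, b):
    return perceptron([a, b], weights=[1.0, 1.0], bias=-1.5)

print(and_gate(1, 1))  # -> 1
print(and_gate(1, 0))  # -> 0
```

In practice the weights and bias are learned from data rather than chosen by hand, which is exactly what the training algorithms below are for.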
The learning rate is the hyperparameter that controls how much a model changes in response to error. Too high, and the model may overshoot optimal solutions; too low, and training becomes inefficient. Finding the right learning rate is crucial for effective training, and it is often one of the most important parameters to tune.
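Both failure modes are easy to see on a toy problem. A sketch, using gradient descent on f(x) = x² (whose gradient is 2x), with arbitrary example learning rates:

```python
# Gradient descent on f(x) = x**2, minimised at x = 0.
# The gradient is 2x, so each step is x -= lr * 2x.
def descend(lr, steps=20, x=1.0):
    for _ in range(steps):
        x -= lr * 2 * x
    return x

print(abs(descend(0.1)))    # well-chosen rate: ends close to 0
print(abs(descend(1.1)))    # too high: overshoots and diverges
print(abs(descend(0.001)))  # too low: barely moves in 20 steps
```

With lr = 0.1 each step multiplies x by 0.8, so it shrinks steadily; with lr = 1.1 each step multiplies x by −1.2, so the iterate grows without bound.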
Loss functions measure the difference between a model's predictions and actual values. By minimizing this function, neural networks improve their accuracy. Different problems require different loss functions, such as mean squared error for regression or cross-entropy for classification.
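The two losses mentioned above can be written directly from their definitions; a minimal sketch:

```python
import math

def mse(preds, targets):
    """Mean squared error: average squared gap between prediction and target."""
    return sum((p - t) ** 2 for p, t in zip(preds, targets)) / len(preds)

def cross_entropy(probs, labels):
    """Binary cross-entropy: penalises confident wrong probabilities heavily.
    probs must lie strictly in (0, 1); labels are 0 or 1."""
    return -sum(l * math.log(p) + (1 - l) * math.log(1 - p)
                for p, l in zip(probs, labels)) / len(probs)

print(mse([2.5, 0.0], [3.0, -0.5]))        # -> 0.25
print(cross_entropy([0.9, 0.2], [1, 0]))   # small, since both guesses are good
```

Note how cross-entropy rewards the model for putting high probability on the correct class, which is why it is the standard choice for classification.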
The algorithm that powers neural network learning, backpropagation calculates gradients of the loss function with respect to weights. It efficiently propagates error information backward through the network, enabling weight updates that improve performance over time.
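For a single sigmoid neuron the whole chain rule fits on a few lines. A hand-rolled sketch (the learning rate and target here are arbitrary illustration values):

```python
import math

def sigmoid(z):
    return 1 / (1 + math.exp(-z))

# One backpropagation step for a single neuron y = sigmoid(w*x + b)
# with squared-error loss L = (y - t)**2, gradients built via the chain rule.
def backprop_step(w, b, x, t, lr=0.5):
    z = w * x + b
    y = sigmoid(z)
    dL_dy = 2 * (y - t)       # derivative of the loss w.r.t. the output
    dy_dz = y * (1 - y)       # derivative of the sigmoid
    dL_dz = dL_dy * dy_dz     # chain rule: error signal at the pre-activation
    w -= lr * dL_dz * x       # dL/dw = dL/dz * x
    b -= lr * dL_dz           # dL/db = dL/dz
    return w, b, (y - t) ** 2

w, b = 0.0, 0.0
for _ in range(100):
    w, b, loss = backprop_step(w, b, x=1.0, t=1.0)
print(loss)  # loss shrinks toward zero as the weights are updated
```

In a deep network the same error signal is propagated layer by layer, each layer multiplying in its own local derivative.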
Activation functions determine the output of neurons based on their inputs. Non-linear activations like ReLU, sigmoid, and tanh allow neural networks to learn complex patterns and relationships that simple linear models cannot capture.
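All three are one-liners in pure Python:

```python
import math

# The three activations named above, applied element-wise to a neuron's input.
def relu(z):
    return max(0.0, z)       # clips negatives to zero; cheap and widely used

def sigmoid(z):
    return 1 / (1 + math.exp(-z))  # squashes any input into (0, 1)

def tanh(z):
    return math.tanh(z)      # squashes into (-1, 1), zero-centred

print(relu(-2.0), relu(3.0))   # -> 0.0 3.0
print(sigmoid(0.0))            # -> 0.5
print(tanh(0.0))               # -> 0.0
```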
Overfitting occurs when models perform well on training data but poorly on new data. Regularization techniques like dropout, L1/L2 regularization, and early stopping help combat this problem by constraining model complexity and promoting generalization.
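Dropout is the easiest of these to sketch. A minimal "inverted dropout" version, where surviving activations are rescaled so their expected sum matches the no-dropout case (the drop probability here is just an example value):

```python
import random

# Inverted dropout: during training, zero each activation with probability p
# and scale the survivors by 1/(1-p) so the expected output is unchanged.
def dropout(activations, p=0.5, training=True):
    if not training:
        return list(activations)  # dropout is a no-op at inference time
    keep = 1.0 - p
    return [a / keep if random.random() < keep else 0.0
            for a in activations]

random.seed(0)
print(dropout([1.0, 2.0, 3.0, 4.0], p=0.5))  # each value is either 0 or doubled
```

Because each forward pass sees a different random subnetwork, no single neuron can be relied on too heavily, which is what pushes the model toward generalisation.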
Experience neural networks in action with our interactive multiplication network visualization. Train a simple neural network to learn multiplication and observe how it processes inputs, adjusts weights, and improves its predictions.
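For a feel of what such a demo trains under the hood, here is a hypothetical pure-Python re-implementation (not the society's actual demo code; the layer size, learning rate, and data grid are arbitrary choices): a tiny one-hidden-layer network fitted to multiplication on a small grid.

```python
import math, random

# Tiny MLP learning x1 * x2 on a 5x5 grid scaled into [0, 1].
random.seed(42)
H, lr = 8, 0.05                          # hidden units, learning rate
W1 = [[random.uniform(-1, 1) for _ in range(2)] for _ in range(H)]
b1 = [0.0] * H
W2 = [random.uniform(-1, 1) for _ in range(H)]
b2 = 0.0

data = [((a / 4, b / 4), (a * b) / 16) for a in range(5) for b in range(5)]

def forward(x):
    h = [math.tanh(W1[i][0] * x[0] + W1[i][1] * x[1] + b1[i]) for i in range(H)]
    return h, sum(W2[i] * h[i] for i in range(H)) + b2

def epoch_loss():
    return sum((forward(x)[1] - t) ** 2 for x, t in data) / len(data)

def train_step(x, t):
    global b2
    h, y = forward(x)
    g = 2 * (y - t)                       # error signal at the output
    for i in range(H):
        dh = g * W2[i] * (1 - h[i] ** 2)  # backprop through tanh
        W2[i] -= lr * g * h[i]
        W1[i][0] -= lr * dh * x[0]
        W1[i][1] -= lr * dh * x[1]
        b1[i] -= lr * dh
    b2 -= lr * g

before = epoch_loss()
for _ in range(200):
    for x, t in data:
        train_step(x, t)
after = epoch_loss()
print(after < before)  # True: the loss fell during training
```

The interactive version animates exactly this loop: inputs flowing forward, the error flowing backward, and the weights nudged a little on every example.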
Ready to explore the fascinating world of neural networks and deep learning? Join our community of like-minded students and researchers at the University of Gloucestershire.
We meet every two weeks for sessions covering various aspects of machine learning, from basic concepts to advanced topics. No prior experience is necessary—just bring your curiosity!
Official Society Page: uogsu.com/society/16083/
Discord Community: discord.gg/HKd9QjF5Vz