Deep Learning with Pytorch – Neural Network Implementation – 1.1

Posted on January 26, 2019 (updated May 20, 2019) by Aritra Sen

In the last post of this series, we covered the basic ideas of PyTorch and how to use a few of its features. In this tutorial we will go through how to code a deep neural network. First we will go through the basic steps required to implement a neural network, and then we will see how to implement the same in PyTorch fashion, taking advantage of PyTorch’s nn classes to make the code more concise and flexible.

First, let’s go through the steps required in a neural network implementation –

  • Initialize weights
  • Do forward propagation
  • Calculate loss
  • Calculate derivatives (backward propagation)
  • Update the weights (backward propagation)
  • Keep repeating the previous steps, driven by gradient descent, until the loss converges (ideally to the global minimum).

In the previous tutorial we saw that PyTorch has a built-in autograd feature to calculate gradients. Let’s see how, using autograd, we can code a simple one-layer neural network from scratch, as shown below.
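A minimal sketch of such a from-scratch version might look like the following. The toy data, tensor shapes, learning rate and number of epochs here are just illustrative placeholders, not values from the original notebook.

```python
import torch

# Toy data (illustrative): 100 samples, 5 input features, 1 target value
X = torch.randn(100, 5)
y = torch.randn(100, 1)

# Initialize weights and bias; requires_grad=True lets autograd track them
w = torch.randn(5, 1, requires_grad=True)
b = torch.zeros(1, requires_grad=True)

learning_rate = 0.01

for epoch in range(100):
    # Forward propagation: a single linear layer
    y_pred = X @ w + b

    # Calculate loss: mean squared error
    loss = ((y_pred - y) ** 2).mean()

    # Backward propagation: autograd computes d(loss)/dw and d(loss)/db
    loss.backward()

    # Update the weights with plain gradient descent, outside autograd tracking
    with torch.no_grad():
        w -= learning_rate * w.grad
        b -= learning_rate * b.grad
        # Reset gradients so they don't accumulate across iterations
        w.grad.zero_()
        b.grad.zero_()
```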

Now let’s do the same thing using PyTorch’s built-in functionalities –
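A sketch of the same one-layer network using PyTorch’s built-in nn.Linear, nn.MSELoss and optim.SGD could look like this (again with illustrative toy data and hyperparameters):

```python
import torch
from torch import nn, optim

# Same illustrative toy data as in the from-scratch sketch above
X = torch.randn(100, 5)
y = torch.randn(100, 1)

# nn.Linear initializes the weights and bias for us
model = nn.Linear(5, 1)
criterion = nn.MSELoss()
optimizer = optim.SGD(model.parameters(), lr=0.01)

for epoch in range(100):
    optimizer.zero_grad()        # reset accumulated gradients
    y_pred = model(X)            # forward propagation
    loss = criterion(y_pred, y)  # calculate loss
    loss.backward()              # backward propagation (calculate derivatives)
    optimizer.step()             # update the weights
```

Notice that nn.Linear takes care of the weight initialization, while optimizer.step() and optimizer.zero_grad() replace the manual weight update and gradient reset from the previous sketch.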

Do like, share and comment if you have any questions.

Category: Machine Learning, Python

Post navigation

← Deep Learning with Pytorch – Getting Started 1.0
Deep Learning with Pytorch-DataLoader,Validation&Test,Dropouts – 1.2 →
