
TP: Dunder Method with Tensor for Automatic Differentiation

TP | Advanced

Author: Remi Genet
Published: 2024-10-21

Understanding Tensors and Automatic Differentiation

The Power of Composition

Imagine you have a complex mathematical function. It might look intimidating at first, but it’s actually just a composition of simple operations. For example, consider this function:

\(f(x, y) = (x^2 + 2y) * \sin(x + y)\)

We can break this down into simpler operations:

  1. \(a = x^2\)
  2. \(b = 2y\)
  3. \(c = a + b\) (which is \(x^2 + 2y\))
  4. \(d = x + y\)
  5. \(e = \sin(d)\) (which is \(\sin(x + y)\))
  6. \(f = c * e\) (our final result)
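
In plain Python (no tensors yet), this decomposition can be written as a sequence of elementary steps:

```python
import math

def f(x, y):
    a = x ** 2          # a = x^2
    b = 2 * y           # b = 2y
    c = a + b           # c = x^2 + 2y
    d = x + y           # d = x + y
    e = math.sin(d)     # e = sin(x + y)
    return c * e        # f = c * e

print(f(1.0, 2.0))      # same value as (1 + 4) * sin(3)
```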

Automatic Differentiation: It’s All About the Chain Rule

Now, here’s the magic: if we know how to differentiate each of these simple operations, we can automatically compute the derivative of the entire complex function. This is the essence of automatic differentiation.

Let’s say we want to find \(\frac{\partial f}{\partial x}\). We can use the chain rule:

\(\frac{\partial f}{\partial x} = \frac{\partial f}{\partial c} \cdot \frac{\partial c}{\partial x} + \frac{\partial f}{\partial e} \cdot \frac{\partial e}{\partial d} \cdot \frac{\partial d}{\partial x}\)

Breaking it down:

- \(\frac{\partial f}{\partial c} = e\)
- \(\frac{\partial c}{\partial x} = 2x\)
- \(\frac{\partial f}{\partial e} = c\)
- \(\frac{\partial e}{\partial d} = \cos(d)\)
- \(\frac{\partial d}{\partial x} = 1\)

Putting it all together:

\(\frac{\partial f}{\partial x} = e \cdot 2x + c \cdot \cos(d) \cdot 1\)
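
As a quick sanity check of this formula, we can compare it to a finite-difference approximation at arbitrarily chosen values (here x = 1, y = 2):

```python
import math

def f(x, y):
    return (x ** 2 + 2 * y) * math.sin(x + y)

def df_dx(x, y):
    # Analytic gradient from the chain rule above: df/dx = e * 2x + c * cos(d)
    c = x ** 2 + 2 * y
    d = x + y
    return math.sin(d) * 2 * x + c * math.cos(d)

x, y, h = 1.0, 2.0, 1e-6
numeric = (f(x + h, y) - f(x, y)) / h
print(df_dx(x, y), numeric)   # the two values agree to about 6 decimal places
```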

Tensors: Tracking Operations and Their Derivatives

This is where tensors come in. In our implementation, a tensor will not only store its value but also remember:

  1. The operation that created it
  2. The tensors that were inputs to this operation
  3. How to compute its derivative with respect to its inputs

By doing this for each operation, we create a computational graph. When we want to compute the derivative of our final result with respect to any input, we can simply walk backwards through this graph, applying the chain rule at each step.

TP Instructions: Implementing a Basic Tensor Class with Automatic Differentiation

Your task is to implement a simplified Tensor class that supports basic mathematical operations and automatic differentiation. This class will allow us to build simple computational graphs and compute gradients automatically.

Here’s a skeleton of the Tensor class to get you started:
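
A minimal sketch of what such a skeleton might look like, assuming scalar-valued tensors and a `_backward` closure per operation (the attribute names below are illustrative choices, not requirements):

```python
class Tensor:
    """A scalar-valued tensor that records how it was computed."""

    def __init__(self, data, _children=(), _op=''):
        self.data = data                  # the numerical value
        self.grad = 0.0                   # d(final output)/d(self), set by backward()
        self._backward = lambda: None     # propagates out.grad to the inputs
        self._prev = set(_children)       # tensors this one was built from
        self._op = _op                    # operation name, useful for debugging

    def __add__(self, other):
        # NOTE: assumes `other` is already a Tensor; handling plain numbers
        # is part of the exercise.
        out = Tensor(self.data + other.data, (self, other), '+')

        def _backward():
            self.grad += out.grad              # d(a+b)/da = 1
            other.grad += out.grad             # d(a+b)/db = 1
        out._backward = _backward
        return out

    def __mul__(self, other):
        out = Tensor(self.data * other.data, (self, other), '*')

        def _backward():
            self.grad += other.data * out.grad  # d(a*b)/da = b
            other.grad += self.data * out.grad  # d(a*b)/db = a
        out._backward = _backward
        return out

    def backward(self):
        # Walk the graph in reverse topological order, applying the chain rule.
        topo, visited = [], set()

        def build(t):
            if t not in visited:
                visited.add(t)
                for child in t._prev:
                    build(child)
                topo.append(t)
        build(self)

        self.grad = 1.0
        for t in reversed(topo):
            t._backward()
```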

Your tasks:

  1. Implement the reflected methods, such as __radd__ and __rmul__, for the operations that already exist.
  2. Implement the __sub__ and __truediv__ methods for subtraction and division operations.
  3. Add support for operations between Tensors and regular numbers (scalars) in all methods.
  4. Implement a sin() method that computes the sine of a Tensor.
  5. Add proper string representation methods (__repr__ and __str__).
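
Each of these operations follows the same pattern as the existing ones: compute the forward value, then register a `_backward` closure that applies the local derivative. As a template, here is how a power operator with a plain-number exponent could be added (illustrative only, not one of the required tasks):

```python
# Inside the Tensor class -- illustrative template, not one of the required tasks.
def __pow__(self, exponent):
    assert isinstance(exponent, (int, float)), "only numeric exponents supported"
    out = Tensor(self.data ** exponent, (self,), f'**{exponent}')

    def _backward():
        # d(x^n)/dx = n * x^(n-1)
        self.grad += exponent * self.data ** (exponent - 1) * out.grad
    out._backward = _backward
    return out
```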

Example Usage: Computing Gradients of a Complex Function

After implementing the Tensor class, you can use it to compute gradients of complex functions. Here’s an example using the function we discussed earlier:
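
A sketch of what that usage might look like once your class is complete (assuming the `sin()` and `backward()` methods described above):

```python
# f(x, y) = (x^2 + 2y) * sin(x + y), evaluated at x = 1, y = 2
x = Tensor(1.0)
y = Tensor(2.0)

c = x * x + 2 * y        # x^2 + 2y  (2 * y relies on __rmul__ / scalar support)
d = x + y                # x + y
f = c * d.sin()          # final result

f.backward()             # backpropagate through the computational graph

print(f"f     = {f.data}")
print(f"df/dx = {x.grad}")   # expected: 2x*sin(x+y) + (x^2+2y)*cos(x+y)
print(f"df/dy = {y.grad}")   # expected: 2*sin(x+y) + (x^2+2y)*cos(x+y)
```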

This example demonstrates how your Tensor class can be used to automatically compute gradients of a complex function. The backward() method computes the gradients with respect to all input tensors.

Unit Tests

Here are some unit tests to verify your implementation. Implement your Tensor class in a file named tensor.py, then use these tests to check your work:
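
A few representative tests in that spirit, written with the standard unittest module (they assume the interface described above, including gradient accumulation and scalar support):

```python
import math
import unittest

from tensor import Tensor  # your implementation


class TestTensor(unittest.TestCase):
    def test_add_and_mul(self):
        a, b = Tensor(2.0), Tensor(3.0)
        out = a * b + a              # a appears twice: gradients must accumulate
        out.backward()
        self.assertAlmostEqual(out.data, 8.0)
        self.assertAlmostEqual(a.grad, 4.0)   # d(ab + a)/da = b + 1
        self.assertAlmostEqual(b.grad, 2.0)   # d(ab + a)/db = a

    def test_scalar_operations(self):
        x = Tensor(4.0)
        out = 2 * x + 1              # exercises __rmul__ and scalar __add__
        out.backward()
        self.assertAlmostEqual(out.data, 9.0)
        self.assertAlmostEqual(x.grad, 2.0)

    def test_sin(self):
        x = Tensor(0.5)
        out = x.sin()
        out.backward()
        self.assertAlmostEqual(out.data, math.sin(0.5))
        self.assertAlmostEqual(x.grad, math.cos(0.5))

    def test_full_function(self):
        # f(x, y) = (x^2 + 2y) * sin(x + y) at x = 1, y = 2
        x, y = Tensor(1.0), Tensor(2.0)
        f = (x * x + 2 * y) * (x + y).sin()
        f.backward()
        self.assertAlmostEqual(f.data, 5 * math.sin(3.0))
        self.assertAlmostEqual(x.grad, 2 * math.sin(3.0) + 5 * math.cos(3.0))
        self.assertAlmostEqual(y.grad, 2 * math.sin(3.0) + 5 * math.cos(3.0))


if __name__ == "__main__":
    unittest.main()
```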
