MyGrad
======
- MyGrad is a lightweight library that adds automatic differentiation to NumPy – its only dependency is NumPy!
+ MyGrad is a lightweight library that adds automatic differentiation to NumPy – its only
+ dependency is NumPy. Simply "drop in" a MyGrad tensor into your NumPy-based code, and
+ start differentiating!
- .. code :: python
+ .. code-block :: pycon
>>> import mygrad as mg
>>> import numpy as np
- >>> x = mg.tensor([1., 2., 3.])  # like numpy.array, but supports backprop!
- >>> f = np.sum(x * x)  # tensors work with numpy functions!
+ >>> x = mg.tensor([1., 2., 3.]) # like numpy.array, but supports backprop
+ >>> f = np.sum(x * x) # tensors can be passed directly to native numpy functions!
>>> f.backward() # triggers automatic differentiation
>>> x.grad # stores [df/dx0, df/dx1, df/dx2]
array([2., 4., 6.])
@@ -26,7 +28,7 @@ Of the various modes and flavors of auto-diff, MyGrad supports backpropagation f
NumPy's ufuncs are richly supported. We can even differentiate through an operation that occurs in-place on a tensor and applies a boolean mask to
the results:
- .. code :: python
+ .. code-block :: pycon
>>> x = mg.tensor([1., 2., 3.])
>>> y = mg.zeros_like(x)
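
The hunk ends before the masked, in-place operation itself. Below is a minimal sketch of how such an example can be completed; the particular ``where`` mask, the use of ``np.multiply`` with ``out=y``, and the printed values are illustrative assumptions rather than output taken from the README.

.. code-block :: pycon

    >>> # hypothetical continuation: write x * x into `y` in-place, only where the mask is True
    >>> np.multiply(x, x, where=[True, False, True], out=y)
    Tensor([1., 0., 9.])
    >>> y.backward()
    >>> x.grad  # masked-out entries receive a zero gradient
    array([2., 0., 6.])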
@@ -40,7 +42,7 @@ NumPy's `view semantics <https://www.pythonlikeyoumeanit.com/Module3_Introducing
indexing and similar operations on tensors will produce a "view" of that tensor's data; thus a tensor and its view share memory.
This relationship will also manifest between the derivatives stored by a tensor and its views!
- .. code :: python
+ .. code-block :: pycon
>>> x = mg.arange(9.).reshape(3, 3)
>>> diag_view = np.einsum("ii->i", x) # returns a view of the diagonal elements of `x`
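
The rest of this example falls outside the hunk. A minimal sketch of the behaviour the paragraph describes is shown below; the reduction ``diag_view ** 2`` and the printed arrays are illustrative assumptions.

.. code-block :: pycon

    >>> (diag_view ** 2).backward()  # differentiate a function of the view
    >>> x.grad  # only the diagonal of the base tensor receives a nonzero gradient
    array([[ 0.,  0.,  0.],
           [ 0.,  8.,  0.],
           [ 0.,  0., 16.]])
    >>> diag_view.grad  # itself a view of the diagonal of `x.grad`
    array([ 0.,  8., 16.])
    >>> np.shares_memory(x.grad, diag_view.grad)
    True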
@@ -75,7 +77,7 @@ This relationship will also manifest between the derivatives stored by a tensor
Basic and advanced indexing is fully supported
- .. code :: python
+ .. code-block :: pycon
>>> (x[x < 4] ** 2).backward()
>>> x.grad
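
The hunk shows neither how ``x`` was defined nor the gradient that gets printed, so here is a hypothetical, self-contained version of the same pattern; the tensor values and output below are assumptions for illustration.

.. code-block :: pycon

    >>> x = mg.tensor([1., 2., 3., 4.])
    >>> (x[x < 4] ** 2).backward()  # boolean (advanced) indexing feeds into backprop
    >>> x.grad  # entries excluded by the mask receive a zero gradient
    array([2., 4., 6., 0.])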
@@ -87,7 +89,7 @@ Basic and advanced indexing is fully supported
NumPy arrays and other array-likes play nicely with MyGrad's tensor. These behave like constants
during automatic differentiation
- .. code :: python
+ .. code-block :: pycon
>>> x = mg.tensor([1., 2., 3.])
>>> constant = [-1., 0., 10] # can be a numpy array, list, or any other array-like
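
The hunk stops before any operation involving ``constant``. A minimal sketch of how the example might finish is below; the specific function and printed gradient are assumptions, but they illustrate the point of the paragraph: the array-like behaves as a constant and receives no gradient, while ``x`` does.

.. code-block :: pycon

    >>> f = np.sum(x * constant)  # `constant` broadcasts like a numpy array
    >>> f.backward()
    >>> x.grad  # df/dx is just `constant`; no gradient is tracked for `constant` itself
    array([-1.,  0., 10.])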