
Commit a781700

improve intro
1 parent 17fba10 commit a781700

File tree

1 file changed: +10 -8 lines changed


docs/source/index.rst

Lines changed: 10 additions & 8 deletions
@@ -5,15 +5,17 @@
 
 MyGrad
 ======
-MyGrad is a lightweight library that adds automatic differentiation to NumPy – its only dependency is NumPy!
+MyGrad is a lightweight library that adds automatic differentiation to NumPy – its only
+dependency is NumPy. Simply "drop in" a MyGrad tensor into your NumPy-based code, and
+start differentiating!
 
-.. code:: python
+.. code-block:: pycon
 
     >>> import mygrad as mg
     >>> import numpy as np
 
-    >>> x = mg.tensor([1., 2., 3.])  # like numpy.array, but supports backprop!
-    >>> f = np.sum(x * x)  # tensors work with numpy functions!
+    >>> x = mg.tensor([1., 2., 3.])  # like numpy.array, but supports backprop
+    >>> f = np.sum(x * x)  # tensors can be passed directly to native numpy functions!
     >>> f.backward()  # triggers automatic differentiation
     >>> x.grad  # stores [df/dx0, df/dx1, df/dx2]
     array([2., 4., 6.])
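
(Sanity check on the intro example above: f = x0**2 + x1**2 + x2**2, so df/dxi = 2*xi, which for x = [1., 2., 3.] is exactly the array([2., 4., 6.]) shown.)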
@@ -26,7 +28,7 @@ Of the various modes and flavors of auto-diff, MyGrad supports backpropagation f
 NumPy's ufuncs are richly supported. We can even differentiate through an operation that occurs in-place on a tensor and applies a boolean mask to
 the results:
 
-.. code:: python
+.. code-block:: pycon
 
     >>> x = mg.tensor([1., 2., 3.])
     >>> y = mg.zeros_like(x)
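
The hunk ends before the masked, in-place operation itself. A minimal sketch of how such an example might continue, assuming (as the prose above claims) that MyGrad tensors accept NumPy's `out` and `where` ufunc arguments; the specific call and outputs here are illustrative, not quoted from the commit:

    >>> x = mg.tensor([1., 2., 3.])
    >>> y = mg.zeros_like(x)
    >>> np.multiply(x, x, where=[True, False, True], out=y)  # in-place, boolean-masked multiply
    Tensor([1., 0., 9.])
    >>> y.backward()
    >>> x.grad  # gradient flows only where the mask was True
    array([2., 0., 6.])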
@@ -40,7 +42,7 @@ NumPy's `view semantics <https://www.pythonlikeyoumeanit.com/Module3_Introducing
 indexing and similar operations on tensors will produce a "view" of that tensor's data; thus a tensor and its view share memory.
 This relationship will also manifest between the derivatives stored by a tensor and its views!
 
-.. code:: python
+.. code-block:: pycon
 
     >>> x = mg.arange(9.).reshape(3, 3)
     >>> diag_view = np.einsum("ii->i", x)  # returns a view of the diagonal elements of `x`
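
The hunk cuts off after the view is created. A plausible continuation, sketched under the assumption that backpropagation populates `diag_view.grad` as a shared-memory view of `x.grad` (which is precisely what the prose above asserts):

    >>> (x ** 2).backward()  # d(x**2)/dx = 2x, elementwise
    >>> x.grad
    array([[ 0.,  2.,  4.],
           [ 6.,  8., 10.],
           [12., 14., 16.]])
    >>> diag_view.grad  # the diagonal of x.grad
    array([ 0.,  8., 16.])
    >>> np.shares_memory(x.grad, diag_view.grad)
    True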
@@ -75,7 +77,7 @@ This relationship will also manifest between the derivatives stored by a tensor
 
 Basic and advanced indexing is fully supported
 
-.. code:: python
+.. code-block:: pycon
 
     >>> (x[x < 4] ** 2).backward()
     >>> x.grad
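
The tensor `x` in this hunk is defined off-screen. Assuming it is the same `x = mg.arange(9.).reshape(3, 3)` from the view-semantics example, the truncated output would plausibly read:

    >>> x = mg.arange(9.).reshape(3, 3)
    >>> (x[x < 4] ** 2).backward()  # boolean (advanced) indexing, then square
    >>> x.grad  # 2x where x < 4, zero elsewhere
    array([[0., 2., 4.],
           [6., 0., 0.],
           [0., 0., 0.]])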
@@ -87,7 +89,7 @@ Basic and advanced indexing is fully supported
 NumPy arrays and other array-likes play nicely with MyGrad's tensor. These behave like constants
 during automatic differentiation
 
-.. code:: python
+.. code-block:: pycon
 
     >>> x = mg.tensor([1., 2., 3.])
     >>> constant = [-1., 0., 10]  # can be a numpy array, list, or any other array-like
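
This hunk also stops short of the payoff. A minimal sketch of how the constant behaves, with the multiplication and `backward` call supplied here as an assumption rather than quoted from the file: the tensor accumulates a gradient, while the plain list participates in the arithmetic but receives none.

    >>> x = mg.tensor([1., 2., 3.])
    >>> constant = [-1., 0., 10]
    >>> (x * constant).backward()  # d/dx (x * c) = c, elementwise
    >>> x.grad
    array([-1.,  0., 10.])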
