
Commit a1c9c6c

Merge pull request #5 from alhirzel/patch-2: Fix typo, "nueral"

2 parents eaa47ad + 202f8d2

2 files changed: +2 −2 lines

lecture2/ml.jmd (+1 −1)

@@ -7,7 +7,7 @@ date: January 6th, 2020
 Let's start by discussing how to use Julia for machine learning from the
 context of scientific machine learning. The core of machine learning is the
 Universal Approximation Theroem (UAT) which states that any sufficiently nice
-function can be approximated by a sufficiently large nueral network. Since this
+function can be approximated by a sufficiently large neural network. Since this
 is what we will be looking at in practice, let's get started with training
 neural networks to match functions instead of data.
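The lecture passage above appeals to the Universal Approximation Theorem: a large enough neural network can fit any sufficiently nice function. A minimal sketch of that idea, fitting a one-hidden-layer network to sin(x) with hand-written NumPy gradient descent (layer size, learning rate, and iteration count are illustrative choices, not taken from the lecture):

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(-np.pi, np.pi, 200).reshape(-1, 1)
y = np.sin(x)  # target function to approximate

# One hidden layer of 32 tanh units (sizes are an assumption for illustration)
W1 = rng.normal(0.0, 1.0, (1, 32)); b1 = np.zeros(32)
W2 = rng.normal(0.0, 0.1, (32, 1)); b2 = np.zeros(1)

lr = 0.05
for _ in range(5000):
    h = np.tanh(x @ W1 + b1)            # hidden activations
    pred = h @ W2 + b2                  # network output
    err = pred - y                      # residual
    # Backpropagate mean-squared-error gradients by hand
    gW2 = h.T @ err / len(x); gb2 = err.mean(axis=0)
    dh = (err @ W2.T) * (1 - h**2)      # tanh'(z) = 1 - tanh(z)^2
    gW1 = x.T @ dh / len(x); gb1 = dh.mean(axis=0)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

mse = float(((np.tanh(x @ W1 + b1) @ W2 + b2 - y) ** 2).mean())
print(f"final mse: {mse:.4f}")
```

The network is trained to match the function directly, which is exactly the "match functions instead of data" framing the lecture uses.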

lecture3/diffeq_ml.jmd (+1 −1)

@@ -223,7 +223,7 @@ augmented_data = vcat(ode_data,zeros(1,size(ode_data,2)))
 
 ### The Universal Ordinary Differential Equation
 
-This formulation of the nueral differential equation in terms of a "knowledge-embedded"
+This formulation of the neural differential equation in terms of a "knowledge-embedded"
 structure is leading. If we already knew something about the differential equation,
 could we use that information in the differential equation definition itself?
 This leads us to the idea of the universal differential equation, which is a
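The "universal differential equation" the lecture introduces keeps the mechanistic terms you already know and lets a network stand in for the unknown part of the right-hand side. A hedged structural sketch (not from the lecture itself; the coefficients, network, and forward-Euler integrator are illustrative assumptions — the network here is untrained, to show only the definition):

```python
import numpy as np

rng = np.random.default_rng(1)
# Tiny random network standing in for the unknown dynamics
W = rng.normal(0.0, 0.1, (2, 16))
V = rng.normal(0.0, 0.1, (16, 2))

def nn(u):
    """Untrained neural correction term NN(u)."""
    return np.tanh(u @ W) @ V

def universal_rhs(u):
    """du/dt = known mechanistic terms + learned correction."""
    known = np.array([1.3 * u[0], -1.8 * u[1]])  # assumed known physics
    return known + nn(u)

# Forward-Euler integration of the universal ODE over t in [0, 1]
u = np.array([0.44, 1.6])
dt = 0.01
for _ in range(100):
    u = u + dt * universal_rhs(u)
print(u)
```

In practice the network parameters would be trained against data (as the lecture goes on to do with DiffEqFlux-style tooling); the point of the sketch is that known structure and the learned term coexist inside one right-hand-side definition.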

0 commit comments