Well, what you hate is the way that math was taught to you.

That soup of equations, abstractions, and solutions to problems we don't even recognize. It's hard to enjoy things we don't feel part of.

But what about relating math techniques to the world that surrounds us? Would it be possible to rediscover math? Would you give it a try?

So many questions! Let’s relax with a cute kitten:

Lovely 😍!

There is a difference between solving hundreds of equations to find what "*X is equal to*"... and actually understanding a mathematical concept like a "derivative" (https://en.wikipedia.org/wiki/Derivative).

Again, we solve dozens of different problems involving derivatives. We practice, we solve, and solve again and, after a while, we have forgotten the solutions to the first exercises.

For sure, there are people comfortable with solving such problems, developing new mathematical theorems, or proving results like Fermat's last theorem (http://www.telegraph.co.uk/science/2016/03/20/why-its-so-impressive-that-fermats-last-theorum-has-been-solved/). That kind of work belongs to the branch of mathematics dedicated to abstraction. Let's name it *pure mathematics*.


Source: http://dominicwalliman.com/

If you like math at this point, Eureka! If you don’t, then I invite you to keep reading… We’ll go to the *applied mathematics field*.

### My personal experience

I was kind of reluctant about math when I studied it the standard way in high school and college. Exercise guides were weighed in kilograms, and students became adept ninjas at solving math problems while forgetting the purpose of it all (other than passing the exam) and without ever knowing its real use.

### “Why am I learning this?” -- That's the question

Everything started to change in college when I attended a seminar called "Artificial Intelligence and Robotics." (What a cool name!)

Once there, I experienced what I wrote about before: the "meaning" came to life for me when the speaker, explaining artificial neural networks, said:

“The artificial neural network can learn...” 🙀 Wow!

They learn thanks to an algorithm called backpropagation... which works by using derivatives!

So derivatives are useful for something other than passing exams! Let’s illustrate this point:

"We fear what we don't understand." -- Many people.

### Math in practice!

This algorithm (backpropagation) is closely related to "gradient descent," the optimization method used nowadays in deep learning (artificial intelligence), for example in the TensorFlow framework.
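Gradient descent can be sketched in a few lines. This is a minimal, hypothetical example (real frameworks do the same thing over millions of parameters): the derivative tells us which way is "downhill," and we take small steps in that direction.

```python
# Minimal gradient descent sketch: minimize f(x) = (x - 4)^2.
# The derivative f'(x) = 2 * (x - 4) points "uphill", so we step against it.

def gradient_descent(start=0.0, learning_rate=0.1, steps=50):
    x = start
    for _ in range(steps):
        gradient = 2 * (x - 4)          # derivative of (x - 4)^2
        x -= learning_rate * gradient   # step against the gradient
    return x

print(gradient_descent())  # converges toward 4.0, the minimum
```

That's the whole intuition: derivatives tell the network which direction reduces the error, and repeating small steps is what we call "training."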

Deep learning is the fancy name for the modern generation of neural networks, popularized by companies like Google (https://static.googleusercontent.com/media/research.google.com/en//pubs/archive/45530.pdf).

You don’t have to know how to do a derivative. However, understanding what is behind it can help you to:

a) Train your logical skills (like solving a hard-level Sudoku).

b) Develop new algorithms.

c) *Others*

If you don't, relax: stay at the high level and use the procedure as a black box, but with a **deep understanding of what the inputs and outputs are**.

--

*Disclaimer*: I'm not a fanatic of derivatives; in fact, I don't even remember how to take one today, except for the "exponential function", thanks to the jokes:

Source: Reddit Jokes

--

Anyway, my personal experience **is not** relevant here. What matters at this point is:

📌 When we learn anything new (even more so when it is abstract), it's handy to **find a meaning** by asking questions like: "what is it useful for," "who is using it," "is it similar to anything I already know," "what's the intuition behind it," and so on...

📌 Another key point I found useful when learning math is to **learn by coding**!

📌 Look for examples—for sure there are plenty around us. Play and destroy the code by changing its parameters. Use the powerful technique of **trial-and-error**.

### Our impartial friend: Trial-and-error

We try once and we end up with a big error. We try again and get a smaller error and, finally, our last try yields the smallest error!

Eventually, the error is so small that we can say we have learned!

(This is an oversimplification but it is how deep learning learns 😉)
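The trial-and-error loop above can be sketched directly. Here is a hypothetical toy (random search for a secret number, not real deep learning): we try a change, keep it only if the error shrinks, and repeat.

```python
import random

# Trial-and-error sketch: home in on a secret number by keeping
# only the guesses that reduce the error.

random.seed(0)                       # make the run reproducible
secret = 7.3
guess = 0.0
best_error = abs(secret - guess)

for trial in range(200):
    candidate = guess + random.uniform(-1, 1)  # try a small change
    error = abs(secret - candidate)
    if error < best_error:                     # keep only improvements
        guess, best_error = candidate, error

print(round(guess, 2))  # ends up close to 7.3
```

Each accepted trial is a smaller error, just like each epoch of training; the "learning" is nothing more than systematically keeping what works.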

Didn't you know that? Now you do. That is an example of learning by intuition, by building on what you already know.

See a real example below. That is how deep learning learns (using R):

Source: https://keras.rstudio.com/articles/training_visualization.html

In deep learning terminology:

- The loss can be seen as the error.
- The accuracy is... well, the accuracy.
- And each epoch is one pass through the data: think of it as time.

*Learn an intro about deep learning at:* https://medium.freecodecamp.org/want-to-know-how-deep-learning-works-heres-a-quick-guide-for-everyone-1aedeca88076

#### Reverse engineering: From artificial intelligence to “real life”

When we study for an exam, we practice several times (epochs) with an exercise guide whose results we know (blue line), so we learn how to solve those problems (making the accuracy go higher).

Then the teacher sets an exam whose results we don't know, and evaluates us against the known answers (orange line).
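This exam analogy can be made concrete with a hypothetical toy: a "memorizer" who only stores the practiced answers versus a "learner" who inferred the underlying rule (here, an invented rule y = 2x). Only the learner passes on unseen questions.

```python
# The "exam" is data the student never saw during practice.
# Hidden rule to learn: y = 2 * x  (an invented toy rule).

train = {1: 2, 2: 4, 3: 6}    # exercise guide with known results
exam = {10: 20, 11: 22}       # unseen exam questions

def memorizer(x):
    return train.get(x)       # fails on anything not practiced

def learner(x):
    return 2 * x              # inferred the underlying rule

def accuracy(model, data):
    return sum(model(x) == y for x, y in data.items()) / len(data)

print(accuracy(memorizer, train), accuracy(memorizer, exam))  # 1.0 0.0
print(accuracy(learner, exam))                                # 1.0
```

Perfect training accuracy with zero exam accuracy is exactly what deep learning calls overfitting, and it is why the orange (validation) line matters more than the blue one.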

### Final words

When we learn, it’s handy to do so **by association** (just like we did in the last paragraphs).

Rather than being a "data warehouse of equations," we as human beings are better suited to interconnecting things, **finding purpose** in the knowledge we want to explore, and **questioning** what is presented in front of us.

Relate concepts, use your actual “real-life” knowledge, practice, and find meaning for what you do.

That’s the message of this post.

*Oh.. still there? Check this out* 👇

I can't guarantee that it will be easy to understand, but I invite you all to read the open-source book I just published, the "Data Science Live Book" 📗.

That's my best try for now in teaching data science from a practical and intuitive perspective.

The book is fully accessible at: http://livebook.datascienceheroes.com 🚀

Thanks 🙂

--

TW: @pabloc_ds.