# Programming Machine Learning - Chapter 4

```elixir
Mix.install(
  [
    {:nx, "~> 0.4"},
    {:explorer, "~> 0.4"},
    {:exla, "~> 0.4"}
  ],
  config: [
    nx: [
      default_backend: EXLA.Backend,
      default_defn_options: [compiler: EXLA]
    ]
  ]
)
```

## Load data

```elixir
defmodule Data do
  def load(file, label_column) do
    # The source files are whitespace-separated; rewrite them as CSV in memory
    # so Explorer can parse them.
    {:ok, data} =
      file
      |> File.stream!()
      |> Enum.reduce([], fn line, acc ->
        line =
          line
          |> String.trim()
          |> String.split()
          |> Enum.join(",")

        [acc | [line, "\n"]]
      end)
      |> :binary.list_to_bin()
      |> Explorer.DataFrame.load_csv()

    # Every column except the label becomes an input feature tensor.
    x_series =
      data
      |> Explorer.DataFrame.discard(label_column)
      |> Explorer.DataFrame.to_series()
      |> Enum.map(fn {_name, values} -> Explorer.Series.to_tensor(values) end)

    # A single column has one entry per example, so its length is the row count.
    {x_rows} = Enum.at(x_series, 0) |> Nx.shape()

    # Prepend a column of ones so the first weight acts as the bias term.
    ones = Nx.broadcast(1, {x_rows})

    x =
      [ones | x_series]
      |> Nx.stack()
      |> Nx.transpose()

    y =
      data
      |> Explorer.DataFrame.pull(label_column)
      |> Explorer.Series.to_tensor()
      |> Nx.reshape({x_rows, 1})

    {x, y}
  end
end
```

## Training

```elixir
defmodule HyperSpace do
  import Nx.Defn

  def train(x, y, iterations, lr) do
    {_x_rows, x_cols} = Nx.shape(x)

    # Start from random weights, one per input column (including the bias column).
    seed = DateTime.utc_now() |> DateTime.to_unix()
    {w, _new_key} = Nx.Random.normal(Nx.Random.key(seed), shape: {x_cols, 1})

    for _ <- 1..iterations, reduce: w do
      w_acc -> update(x, y, lr, w_acc)
    end
  end

  def train_batched(x, y, iterations, lr) do
    {_x_rows, x_cols} = Nx.shape(x)

    seed = DateTime.utc_now() |> DateTime.to_unix()
    {w, _new_key} = Nx.Random.normal(Nx.Random.key(seed), shape: {x_cols, 1})

    # Split the data into mini-batches of 10 examples and run one update per batch.
    x_stream = Nx.to_batched(x, 10)
    y_stream = Nx.to_batched(y, 10)
    stream = Stream.zip(x_stream, y_stream)

    for _ <- 1..iterations, reduce: w do
      w_acc ->
        Enum.reduce(stream, w_acc, fn {x, y}, w_acc ->
          update(x, y, lr, w_acc)
        end)
    end
  end

  # -- Private

  defnp predict(x, w) do
    Nx.dot(x, w)
  end

  defnp loss(x, y, w) do
    Nx.mean((predict(x, w) - y) ** 2)
  end

  defnp gradient(x, y, w) do
    grad(w, &loss(x, y, &1))
  end

  defnp update(x, y, lr, w) do
    w - gradient(x, y, w) * lr
  end
end
```

## Pizzas

```elixir
{x, y} = Data.load("#{__DIR__}/../book/04_hyperspace/pizza_3_vars.txt", "Pizzas")
```

```elixir
HyperSpace.train(x, y, _iterations = 100_000, _learning_rate = 0.001)
```

```elixir
HyperSpace.train_batched(x, y, _iterations = 100_000, _learning_rate = 0.001)
```

<!-- livebook:{"branch_parent_index":1} -->

## Life expectancy

```elixir
{x, y} =
  Data.load(
    "#{__DIR__}/../book/data/life-expectancy/life-expectancy-without-country-names.txt",
    "Life"
  )
```

A small learning rate is required here; with a larger one, gradient descent diverges, the weights overflow, and training returns nothing but `NaN`:

<!-- livebook:{"force_markdown":true} -->

```elixir
#Nx.Tensor<
  f64[4][1]
  EXLA.Backend<host:0, 0.2555518063.1281753108.145331>
  [
    [NaN],
    [NaN],
    [NaN],
    [NaN]
  ]
>
```

```elixir
HyperSpace.train(x, y, _iterations = 1_000_000, _learning_rate = 0.0001)
```
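As a quick sanity check on the learning rate, the returned weights can be used to recompute the mean squared error by hand, since `predict/2` and `loss/3` are private to `HyperSpace`. The sketch below is not part of the original notebook; it assumes `x` and `y` are the tensors from `Data.load/2` above, and the bindings `w`, `predictions`, `errors`, and `mse` are illustrative names.

```elixir
# A minimal evaluation sketch (assumption, not from the book's code):
# bind the trained weights, then mirror the private predict/2 and loss/3.
w = HyperSpace.train(x, y, _iterations = 1_000_000, _learning_rate = 0.0001)

# Predictions are the dot product of inputs and weights.
predictions = Nx.dot(x, w)

# Mean squared error between predictions and labels.
errors = Nx.subtract(predictions, y)
mse = Nx.mean(Nx.multiply(errors, errors))
```

A finite `mse` that shrinks as iterations increase is a good sign the chosen learning rate is small enough; the `NaN` tensor shown above is what the same computation produces once the weights have overflowed.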