
Getting Started with TensorFlow.js | Deep Learning for JavaScript Hackers (Part 0)

Venelin Valkov · 5 min read

Based on Venelin Valkov's video on YouTube. If you like this content, support the original creators by watching, liking and subscribing to their content.

TL;DR

TensorFlow.js enables TensorFlow tensor operations and model training in JavaScript, including in-browser workflows and Node.js usage.

Briefing

TensorFlow.js is positioned as a way to run machine-learning workflows directly in JavaScript, either in the browser or in Node.js, by bridging TensorFlow’s core C/C++ implementation to JS. That matters because it lets developers load pre-trained models created in Python/Keras and use them inside apps built with frameworks like React or Angular, without standing up a separate ML service. The setup starts with exploring the TensorFlow.js documentation and demos, then moves into a “sandbox” project built with vanilla JavaScript.

The tutorial walks through installing TensorFlow.js and related dev dependencies, creating a basic HTML page, and wiring a script that waits for the DOM before running an async function. From there, it introduces tensors as TensorFlow.js’s core data structure: N-dimensional containers that hold the inputs and outputs used during model training. A first tensor example turns a JavaScript array into a 1D tensor, then checks its rank (dimensionality) and shape (size along each dimension). It also highlights a common gotcha: logging a tensor directly won’t show its values; calling methods like print (or an equivalent) is needed to reveal the underlying data.

Next comes building intuition for tensor manipulation and math. The tutorial creates 2D tensors (matrices) from 2D arrays, including an example using strings to show tensors can technically hold non-numeric values even though ML workflows typically use numbers. It demonstrates utility operations such as reshape (turning a vector into a matrix with specified rows and columns) and basic element-wise arithmetic like addition between tensors. For linear algebra, it uses dot product to multiply matrices/vectors and explains the row-by-column multiplication behavior. It also covers transpose, showing how flipping rows and columns changes the resulting matrix—an operation that generalizes to higher dimensions even if it’s harder to visualize.

To make results visible in a web app, the tutorial adds TFJS Vis, a visualization library that renders charts in the browser. It creates a bar chart by supplying an array of objects with index and value fields, then renders it into a custom div container in the HTML. It briefly notes styling options like axis labels and chart height.

The core “getting started” payoff is a minimal supervised-learning model: converting kilograms to pounds. The model is built with tf.sequential and a single dense layer with one unit, meaning it learns one parameter (the conversion factor). Training data is generated synthetically: 2,000 examples of random kilograms with corresponding pounds labels computed using the known relationship (~2.2). The model is compiled with mean squared error as the loss function and Adam as the optimizer, then trained for many epochs with shuffling enabled to avoid learning spurious ordering.

After training, the model predicts pounds for an input like 10 kg and produces a value much closer to the expected ~22 than an untrained model. The contrast—untrained predictions being wildly off, trained predictions improving substantially—serves as the practical proof that TensorFlow.js can learn in-browser and that the tensor/math pipeline is working end-to-end. The next planned step is applying the same workflow to a real dataset for diabetes prediction and evaluating training performance.

Cornell Notes

TensorFlow.js brings TensorFlow’s tensor operations and model training to JavaScript, enabling ML in the browser or in Node.js and reuse of pre-trained models. The tutorial treats tensors as the central data structure: rank and shape describe dimensionality, and tensor values require explicit printing rather than direct console logging. It demonstrates key tensor operations—reshape, element-wise add, dot product, and transpose—then uses TFJS Vis to render a bar chart from simple data objects. Finally, it builds a tiny sequential model to learn a kilograms-to-pounds conversion factor using mean squared error loss and the Adam optimizer, showing that training dramatically improves predictions compared with an untrained model.

Why are tensors the “core data structure” in TensorFlow.js, and how do rank and shape help describe them?

Tensors act as N-dimensional containers for the data that models consume and produce. Rank indicates how many dimensions the tensor has (e.g., a 1D tensor has rank 1). Shape describes the size of each dimension; for example, a 1D tensor created from a 3-element array has shape [3], while a 2D tensor built from a 2×3 array has shape [2, 3]. These properties matter because many operations (like dot products and reshapes) require compatible dimensions.

What’s the practical difference between logging a tensor and printing its values?

Directly logging a tensor often won’t show the underlying numeric/string values you expect. To see the contents, the tutorial uses a print method (e.g., tf.Tensor.print or an equivalent) to display the inner values. This distinction is important for debugging: you can confirm rank/shape from metadata, but you need explicit printing to verify actual data.

How do reshape, dot product, and transpose change tensor data, and when are they useful?

Reshape changes the arrangement of elements without changing the total number of elements—for instance, turning a vector into a 2×3 matrix. Dot product performs row-by-column multiplication between compatible matrices/vectors, producing a new tensor whose dimensions depend on the input shapes. Transpose swaps rows and columns (e.g., a 2×2 matrix [[1,2],[3,4]] becomes [[1,3],[2,4]]), and the operation generalizes to higher dimensions even though visualization gets harder.

How does TFJS Vis render a chart, and what data format does it expect for a bar chart?

TFJS Vis renders charts into a DOM element (a div container). For a bar chart, it expects an array of objects where each object includes an index and a value. The tutorial creates sample data like {index: 'Jane', value: 10} and then calls a renderBarChart-like function with the container element and that data array. Styling options include axis labels and chart height.

What does the kilograms-to-pounds model learn, and why does training improve predictions?

The model is a tf.sequential network with one dense layer and one unit, so it learns a single parameter representing the conversion factor. Training data pairs kilograms (features) with pounds (labels) generated from the known relationship (~2.2). Using mean squared error as the loss and Adam as the optimizer, gradient-based updates adjust the learned parameter to reduce prediction error. After training, predictions for inputs like 10 kg move from wildly incorrect values (untrained) toward the expected ~22.

Why shuffle training data, and what does “epochs” mean in this setup?

Shuffling prevents the model from relying on any accidental ordering in the training examples (e.g., treating earlier samples as temporally related to later ones). Epochs define how many times the model sees the full training dataset; in the tutorial, the model is trained for many epochs, repeatedly updating weights to minimize mean squared error.

Review Questions

  1. How do rank and shape differ, and how would you verify them for a 2D tensor created from a 2×3 array?
  2. What conditions must hold for dot product to work between two tensors, and how does transpose affect those conditions?
  3. In the kilograms-to-pounds example, what roles do mean squared error and Adam play during training?

Key Points

  1. TensorFlow.js enables TensorFlow tensor operations and model training in JavaScript, including in-browser workflows and Node.js usage.
  2. Tensors are N-dimensional data containers; rank indicates dimensionality and shape indicates the size along each dimension.
  3. Tensor values often require explicit printing to inspect contents, while rank/shape can be checked via tensor metadata.
  4. Core tensor operations demonstrated include reshape, element-wise arithmetic, dot product, and transpose.
  5. TFJS Vis can render charts directly into a chosen div container, using simple data objects (index/value) for bar charts.
  6. A minimal supervised model for unit conversion can be built with tf.sequential and a single dense layer, trained using mean squared error loss and the Adam optimizer.
  7. Training quality is validated by comparing predictions from an untrained model versus a trained model on the same input.

Highlights

Tensors are treated as the universal data format for ML in TensorFlow.js: rank and shape guide how data can be manipulated and combined.
Dot product is implemented as row-by-column multiplication for compatible matrices/vectors, while transpose swaps rows and columns to change matrix orientation.
A one-parameter sequential model can learn the kilograms-to-pounds conversion factor from synthetic training pairs, and training dramatically improves predictions.

Topics

Mentioned

  • TensorFlow.js
  • TFJS Vis