Numerical Analysis

Foundations and Exercises

TANAKA, Kazuaki (田中 一成)

Global Center for Science and Engineering, Waseda University

Before We Start

📥
Please clone (or pull) the course materials repository right now.
First time
git clone https://github.com/waseda-num-analysis-2026/materials
Already cloned
cd materials
git pull

💡 Not sure how? Just ask your AI — it can run these commands for you.

Today's materials are inside 040/:

  • 4th-handout.qmd — 📘 main material: full content of this lecture
  • 4th-handout.html — rendered handout, viewable in a browser
  • 4th.qmd — slides (this file): exercises & instructions only
  • 4th.html — rendered slides

Today’s Main Material

LECTURE 4
Vector Norms and Matrix Norms
The handout is the textbook. These slides are only for class logistics and assignments.
Today you will submit
  • Ex 4.0: AI Q&A blocks in the handout
  • Ex 4.1: proof of norm properties
  • Ex 4.2: numerical verification
  • Ex 4.3: random-search estimate of matrix norms

Reminder — Session 3 Assignments

REMINDER
Ex 3.0–3.3 deadlines were extended to May 14 (Thu), 23:59 JST.
🎥 Ex 3.2 video is still required. Pair work using the video will be held in Session 5 or later.
If you have grading or assignment questions, reply in the relevant GitHub Issue or ask in Mattermost #question.

Exercise 4.0 — Evolve the Handout with AI

Repository: same as Ex 4.1   |   Submit: 4th-handout.qmd (your edited copy)   |   Deadline: May 14 (Thu), 23:59

While studying 4th-handout.qmd, insert at least 3 Q&A blocks of your own — questions you actually had while reading, answered with the help of GitHub Copilot (or another AI) following the protocol in AI_TUTOR.md.

How to submit

  1. Copy materials/040/4th-handout.qmd into the root of your Ex 4 repository
  2. Open it in VS Code, then for each question:
    • First prime the chat once with AI_TUTOR.md
    • Highlight the line you don’t understand
    • Press ⌘L / Ctrl+L, then ask the AI to add a Q&A block
    • Verify the AI’s answer, then commit the inserted Q&A block
  3. Repeat for ≥ 3 distinct questions, then git push

💡 The Q&A block format is explained at the top of 4th-handout.qmd.

Exercise 4.1 — Prove the Standard Norms Are Norms

Repository: Ex 4 — GitHub Classroom link   |   Edit: ex4-1.qmd   |   Deadline: May 14 (Thu), 23:59

Prove that \(\ell^1\), \(\ell^2\), and \(\ell^\infty\) are vector norms on \(\mathbb{R}^n\).

For each norm, verify:

  1. Positivity
  2. Homogeneity
  3. Triangle inequality

⚠️ AI is not recommended for the proof itself. You may use AI to convert handwritten equations to LaTeX, but the mathematical reasoning should be yours.

Exercise 4.2 — Verify Norm Inequalities Numerically

Repository: same as Ex 4.1   |   Edit: ex4-2.qmd   |   Deadline: May 14 (Thu), 23:59

Confirm the following inequalities numerically using random vectors:

\[ \|x\|_2 \leq \|x\|_1 \leq \sqrt{n}\,\|x\|_2, \qquad \|x\|_\infty \leq \|x\|_1 \leq n\,\|x\|_\infty, \]

\[ \|x\|_\infty \leq \|x\|_2 \leq \sqrt{n}\,\|x\|_\infty. \]

Minimum requirement

  • Test several dimensions, e.g. n = 2, 5, 10, 50
  • Use many random vectors for each dimension
  • Report clearly whether all tests passed
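One possible shape for these checks (a minimal sketch, not the required solution; the function name, seed, trial count, and the small tolerance for floating-point rounding are my own choices):

```python
import numpy as np

rng = np.random.default_rng(0)  # fixed seed so runs are reproducible

def check_norm_inequalities(n, trials=1000, tol=1e-12):
    """Return True if all six norm inequalities hold for `trials` random vectors in R^n."""
    for _ in range(trials):
        x = rng.standard_normal(n)
        n1 = np.linalg.norm(x, 1)
        n2 = np.linalg.norm(x, 2)
        ninf = np.linalg.norm(x, np.inf)
        ok = (
            n2 <= n1 + tol and n1 <= np.sqrt(n) * n2 + tol      # ||x||_2 <= ||x||_1 <= sqrt(n)||x||_2
            and ninf <= n1 + tol and n1 <= n * ninf + tol        # ||x||_inf <= ||x||_1 <= n||x||_inf
            and ninf <= n2 + tol and n2 <= np.sqrt(n) * ninf + tol  # ||x||_inf <= ||x||_2 <= sqrt(n)||x||_inf
        )
        if not ok:
            return False
    return True

for n in (2, 5, 10, 50):
    print(n, check_norm_inequalities(n))
```

The tolerance only absorbs rounding error; the inequalities themselves hold exactly in real arithmetic.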

Exercise 4.3 — Random Search for Matrix Norms

Repository: same as Ex 4.1   |   Edit: ex4-3.qmd   |   Deadline: May 14 (Thu), 23:59

Estimate the induced matrix norm \(\|A\|_p\) from below by sampling random vectors.

AI use is allowed, but you must verify and explain the results yourself.

  • Choose at least two matrices \(A\); include one \(2 \times 2\) example for geometric discussion.
  • For \(p = 1, 2, \infty\), sample many nonzero vectors and compute \(r_p(x)=\|Ax\|_p/\|x\|_p\).
  • For one \(2 \times 2\) matrix, try several sample sizes, e.g. N = 10, 100, 1000, 10000; report the sampled maximum, true value, and relative gap.
  • Compare the sampled maximum with the true value: is it above, below, or equal? Explain why. Then discuss why the estimate is close in some cases but not in others.
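A possible shape for the random search (a sketch under my own choices of example matrix, seed, and sample count, not the required solution). Since `numpy.linalg.norm(A, ord=p)` returns the true induced norm for p = 1, 2, ∞, the sampled maximum should sit at or below it:

```python
import numpy as np

rng = np.random.default_rng(1)  # fixed seed for reproducibility

def random_search_norm(A, p, num_samples=10000):
    """Estimate the induced norm ||A||_p from below by sampling random nonzero vectors."""
    n = A.shape[1]
    best = 0.0
    for _ in range(num_samples):
        x = rng.standard_normal(n)           # nonzero with probability 1
        r = np.linalg.norm(A @ x, p) / np.linalg.norm(x, p)
        best = max(best, r)
    return best

A = np.array([[2.0, 1.0], [0.0, 3.0]])       # example 2x2 matrix (my own choice)
for p in (1, 2, np.inf):
    est = random_search_norm(A, p)
    true = np.linalg.norm(A, ord=p)
    print(p, est, true, (true - est) / true)  # relative gap is nonnegative
```

Because every sampled ratio \(r_p(x)\) is at most the supremum defining \(\|A\|_p\), the estimate can never exceed the true value.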

Exercise 4.3 — Optional Challenge

For \(p = 2\), do the following extra analysis for one \(2 \times 2\) matrix:

\[ \|A\|_2 = \sqrt{\lambda_{\max}(A^T A)}. \]

  • Compute \(\lambda_{\max}(A^T A)\) and confirm that it gives the same value as numpy.linalg.norm(A, ord=2).
  • Compute an eigenvector \(v_{\max}\) of \(A^T A\) for \(\lambda_{\max}\), and check the stretch ratio \(\|Av_{\max}\|_2 / \|v_{\max}\|_2\).
  • Compare \(v_{\max}\) with the best direction found by your random search.
  • Briefly explain why this direction gives the maximum stretching factor.

This bonus goes beyond today's main lecture. It is for students who want to connect matrix norms with eigenvalues and maximum-stretching directions.
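The steps above can be sketched as follows (the 2×2 matrix is my own example, not one prescribed by the exercise). Since \(A^T A\) is symmetric, `numpy.linalg.eigh` applies and returns its eigenvalues in ascending order:

```python
import numpy as np

A = np.array([[2.0, 1.0], [0.0, 3.0]])  # example 2x2 matrix (my own choice)

eigvals, eigvecs = np.linalg.eigh(A.T @ A)  # eigh: for symmetric matrices, ascending eigenvalues
lam_max = eigvals[-1]                        # largest eigenvalue of A^T A
v_max = eigvecs[:, -1]                       # corresponding eigenvector

spectral = np.sqrt(lam_max)
print(spectral, np.linalg.norm(A, ord=2))    # the two values should agree

stretch = np.linalg.norm(A @ v_max) / np.linalg.norm(v_max)
print(stretch)                               # stretch ratio along v_max equals ||A||_2
```

Comparing `v_max` (up to sign) with the best direction your random search found should show they point the same way.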

Assignments and Preparation for Next Lecture

1. Accept the Ex 4 GitHub Classroom link: https://classroom.github.com/a/IIB7mFi7

  • One repository will contain Ex 4.0, 4.1, 4.2, and 4.3
  • Edit ex4-1.qmd, ex4-2.qmd, ex4-3.qmd + your copy of 4th-handout.qmd for Ex 4.0
  • Deadline: May 14 (Thu), 23:59
  • Submit by git push — the last commit before the deadline is graded
2. Keep working on Ex 3.0–3.3 if not finished yet

  • Extended deadline: May 14 (Thu), 23:59 JST
  • Ex 3.2 video pair work will be announced later