diff --git a/assign9.md b/assign9.md
index 19ed9b3064b1aa829b5b9b799e606e81ab6eca03..a514f962f7553798558e474ecd77432a4b3a0722 100644
--- a/assign9.md
+++ b/assign9.md
@@ -29,7 +29,7 @@ the YARN resource manager.
 
 
 ### Vagrant
-This is the **recommended** way to do this project. 
+This is a fine way to do this project, though Docker is a bit more streamlined if you already have Docker installed locally.
 
 As before, we have provided a VagrantFile in the `assignment9` directory. Since
 the Spark distribution is large, we ask you to download that directly from the
@@ -42,18 +42,27 @@ This step is included in the VagrantFile, but if you get any error
 
 We are ready to use Spark.
 
-### Mac and Homebrew
-I'd rather not pollute my mac w/ all the detritus from starting
-`spark` up, so I have not pioneered a mac approach, though you are
-free to.
-
-If you have apple silicon, I recommend you wait until the docker
-approach has been checked out (soon).
-
 ### Docker
-Probably before Thanksgiving.
-
-
+This is the **recommended** way to run the project if you have **Apple
+silicon**, and you get to learn about containers at the same time. The
+Docker [Get Started](https://www.docker.com/get-started/) page
+describes setup, and this **does work w/ Apple silicon**. It probably
+works well on Windows too, but I have no direct experience with that.
+
+Steps:
+- [Install docker](https://www.docker.com/get-started/)
+- Build your image: `docker build -t assign9 .`
+- Start a container based on that image, and attach to a bash shell in it: `docker run -v "$PWD":/assign9 -it assign9`.
+  - You will drop right into `/assign9`, which is where the enclosing
+    directory is mounted in the container.
+  - Any changes you make in this container directory, or in the same
+    directory from the shell of your host machine, are reflected on
+    the other side.
+  - The container will shut down as soon as you exit the shell.
+  - Get rid of exited containers via `docker container prune -f`.
+  - Ignore the version of spark in the distro.
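+
+The `docker build` step above reads the Dockerfile shipped in the
+`assignment9` directory. As a rough sketch of what such a file does
+(the base image, package list, and Spark setup here are illustrative
+assumptions, not the actual provided file), it looks something like:
+
+```dockerfile
+# Illustrative sketch only -- use the Dockerfile provided in assignment9.
+# Start from a slim image with a JVM, which Spark requires.
+FROM openjdk:11-jre-slim
+
+# Install Python and PySpark so spark jobs can be run inside the container.
+RUN apt-get update && apt-get install -y python3 python3-pip \
+    && pip3 install pyspark
+
+# Working directory where the host directory gets bind-mounted.
+WORKDIR /assign9
+
+# Drop into a shell when the container starts.
+CMD ["/bin/bash"]
+```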
+  
 ## Spark and Python
 
 Spark primarily supports three languages: Scala (Spark is written in Scala),