### Vagrant
This is a fine way to do this project, though Docker is a bit more streamlined if you already have Docker installed locally.
As before, we have provided a VagrantFile in the `assignment9` directory. Since
the Spark distribution is large, we ask you to download that directly. This step
is included in the VagrantFile, but if you get any errors during provisioning you
can run the download by hand inside the VM, as sketched below.
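A minimal sketch of doing that download by hand, assuming a Spark 3.x binary build from the Apache archive; the exact version and install location below are assumptions, so prefer whatever the VagrantFile actually specifies:

```bash
# Inside the VM: fetch and unpack a Spark binary distribution.
# The version here is an assumption -- use the one the VagrantFile expects.
wget https://archive.apache.org/dist/spark/spark-3.3.1/spark-3.3.1-bin-hadoop3.tgz
tar xzf spark-3.3.1-bin-hadoop3.tgz

# Put Spark's launchers (spark-submit, pyspark, ...) on the PATH for this shell.
export SPARK_HOME="$PWD/spark-3.3.1-bin-hadoop3"
export PATH="$SPARK_HOME/bin:$PATH"

spark-submit --version   # quick sanity check
```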
We are ready to use Spark.
### Mac and Homebrew
I'd rather not pollute my mac w/ all the detritus from starting
`spark` up, so I have not pioneered a mac approach, though you are
free to. If you have apple silicon, use the Docker approach below instead.
### Docker
This is the **recommended** way to run the project if you have **Apple
silicon**, and you get to learn about containers at the same time. Docker
[Get Started](https://www.docker.com/get-started/) describes setup. It
probably works well w/ windows as well, but I have no direct experience
with that.
Steps:
- [Install docker](https://www.docker.com/get-started/)
- Build your image: `docker build -t assign9 .`
- Start a container based on that image, and attach to a bash shell in it: `docker run -v "$(pwd)":/assign9 -it assign9` (see the sanity-check sketch after this list).
- You will drop right into `/assign9`, which is where the enclosing
directory is mounted in the container.
- Any changes you make, either in this container directory or outside
  in a shell on your host machine, are reflected on the other side.
- The container will shut down as soon as you exit the shell.
- Get rid of exited containers via `docker container prune -f`.
- Ignore the version of spark in the distro.
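As a quick sanity check once you are attached to the container, something like the following should work, assuming the provided image puts Spark's `bin` directory on the PATH; the script name is just a placeholder, not part of the assignment:

```bash
cd /assign9                 # the bind-mounted assignment directory
spark-submit --version      # confirm Spark is installed and on the PATH
spark-submit my_job.py      # run a job; "my_job.py" is a hypothetical name
pyspark                     # or start an interactive Python shell for Spark
```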
## Spark and Python
Spark primarily supports three languages: Scala (Spark is written in Scala),
Java, and Python.