Commit 1067735f authored by Pierre Letessier, committed by Matthijs Douze

Add Dockerfile (#55)

* Add Dockerfile

* Update documentation with Docker build/run details
parent 0faf794e
sift1M
\ No newline at end of file
FROM nvidia/cuda:8.0-devel-ubuntu16.04
MAINTAINER Pierre Letessier <pletessier@ina.fr>
RUN apt-get update -y
RUN apt-get install -y libopenblas-dev python-numpy python-dev swig git python-pip wget
RUN pip install matplotlib
RUN cd /opt/ && git clone https://github.com/facebookresearch/faiss.git
WORKDIR /opt/faiss
RUN mv example_makefiles/makefile.inc.Linux ./makefile.inc
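# Point BLASLDFLAGS in makefile.inc at the Ubuntu libopenblas path (/usr/lib) instead of the default /usr/lib64 one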
RUN sed -i -e 's%^BLASLDFLAGS=/usr/lib64/libopenblas.so.0%# BLASLDFLAGS=/usr/lib64/libopenblas.so.0%g' -e 's%^# BLASLDFLAGS=/usr/lib/libopenblas.so.0$%BLASLDFLAGS=/usr/lib/libopenblas.so.0%g' makefile.inc
RUN make tests/test_blas -j $(nproc) && \
make -j $(nproc) && \
make tests/demo_sift1M -j $(nproc)
RUN make py
RUN cd gpu && \
make -j $(nproc) && \
make test/demo_ivfpq_indexing_gpu && \
make py
# RUN ./tests/test_blas && \
# tests/demo_ivfpq_indexing
# RUN wget ftp://ftp.irisa.fr/local/texmex/corpus/sift.tar.gz && \
# tar xf sift.tar.gz && \
# mv sift sift1M
# RUN tests/demo_sift1M
...@@ -15,6 +15,8 @@ involved:
Steps 2 and 3 depend on 1, but they are otherwise independent.
Alternatively, all 3 steps above can be run by building a Docker image (see section "Docker instructions" below).
General compilation instructions
================================
...@@ -247,6 +249,31 @@ The auto-tuning example above also runs on the GPU. Edit
to enable and run it.
Docker instructions
===================
To use the GPU capabilities of Faiss, you'll need to run "nvidia-docker"
rather than "docker". Make sure that docker
(https://docs.docker.com/engine/installation/) and nvidia-docker
(https://github.com/NVIDIA/nvidia-docker) are installed on your system.
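To check that nvidia-docker can see the GPUs, a quick sanity check (it uses
the stock nvidia/cuda image and is not specific to Faiss) is
nvidia-docker run --rm nvidia/cuda nvidia-smi
which should list the GPUs available on the host.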
To build the "faiss" image, run
nvidia-docker build -t faiss .
If you want to run the tests during the docker build, uncomment the
last 3 "RUN" steps in the Dockerfile. But you may prefer to run the
tests yourself; in that case, just run
nvidia-docker run -ti --name faiss faiss bash
and run what you want. If you need a dataset (like sift1M), download it
inside the created container, or, better, mount a directory from the host:
nvidia-docker run -ti --name faiss -v /my/host/data/folder/ann_dataset/sift/:/opt/faiss/sift1M faiss bash
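Inside the container you can then run the binaries built by the Dockerfile,
for example
./tests/test_blas
tests/demo_sift1M
gpu/test/demo_ivfpq_indexing_gpu
(demo_sift1M expects the dataset in /opt/faiss/sift1M, i.e. the directory
mounted above).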
How to use Faiss in your own projects
=====================================
......
...@@ -12,7 +12,7 @@ The GPU implementation can accept input from either CPU or GPU memory. On a serv
## Building
The library is mostly implemented in C++, with optional GPU support provided via CUDA, and an optional Python interface. The CPU version requires a BLAS library. It compiles with a Makefile and can be packaged in a docker image. See [INSTALL](INSTALL) for details.
## How Faiss works
......