# fediverse.space 🌐

The map of the fediverse that you always wanted.
## Requirements

- For everything:
  - Docker
  - docker-compose
- For the scraper + API:
  - Python 3
- For laying out the graph:
  - Java
- For the frontend:
  - Yarn
## Running it

### Backend

- `cp example.env .env` and modify the environment variables as required
- `docker-compose build`
- `docker-compose up -d django`
  - if you don't specify `django`, it'll also start `gephi`, which should only be run as a one-off job
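The backend steps above can be wrapped in a small convenience script. This is only a sketch (no such script ships with the repo); it defaults to a dry run that prints each command, and executes them for real when invoked with `DRY_RUN=0`:

```shell
#!/bin/sh
# Sketch of a backend bring-up script (hypothetical, not part of this repo).
# Defaults to dry-run; set DRY_RUN=0 to actually execute the commands.
set -e
DRY_RUN="${DRY_RUN:-1}"

run() {
  LAST_CMD="$*"
  if [ "$DRY_RUN" = "1" ]; then
    echo "would run: $*"
  else
    "$@"
  fi
}

# Only copy the example env file if .env doesn't exist yet,
# so local edits are never overwritten.
[ -f .env ] || run cp example.env .env

run docker-compose build
run docker-compose up -d django
```

Guarding the `cp` behind the `.env` existence check keeps the script safe to re-run after you've customized your environment variables.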
### Frontend

- `cd frontend && yarn install`
- `yarn start`
## Commands

### Backend

After running the backend in Docker:

- `docker-compose exec web python manage.py scrape` scrapes the entire fediverse
- `docker-compose exec web python manage.py build_edges` aggregates this information into edges with weights
- `docker-compose run gephi java -Xmx1g -jar build/libs/graphBuilder.jar` lays out the graph

To run in production, use `docker-compose -f docker-compose.yml -f docker-compose.production.yml` instead of just `docker-compose`.
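The `build_edges` step above turns scraped per-instance interaction data into weighted edges for the graph layout. The repo's actual logic lives in its Django management command; as a rough illustration only (the function name and data shape here are invented for the example), aggregating directed mentions into undirected weighted edges could look like:

```python
from collections import Counter

def build_edges(mentions):
    """Aggregate directed mention pairs into undirected weighted edges.

    `mentions` is a hypothetical list of (source_instance, target_instance)
    pairs; an edge's weight is the total number of mentions in either
    direction between the two instances.
    """
    weights = Counter()
    for source, target in mentions:
        if source == target:
            continue  # ignore self-mentions
        # Sort the pair so a->b and b->a accumulate on the same edge.
        weights[tuple(sorted((source, target)))] += 1
    return dict(weights)

edges = build_edges([
    ("mastodon.social", "fosstodon.org"),
    ("fosstodon.org", "mastodon.social"),
    ("mastodon.social", "mastodon.social"),  # self-mention, ignored
])
# edges == {("fosstodon.org", "mastodon.social"): 2}
```

Collapsing both directions onto one sorted key is what makes the output suitable for an undirected layout tool like the Gephi toolkit job above.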
### Frontend

- `yarn build` to create an optimized build for deployment