# fediverse.space 🌐
The map of the fediverse that you always wanted.
![A screenshot of fediverse.space](screenshot.png)
## Requirements
- For everything:
  - Docker
  - docker-compose
- For the scraper + API:
  - Python 3
- For laying out the graph:
  - Java
- For the frontend:
  - Yarn
## Running it
### Backend
- `cp example.env .env` and modify the environment variables as required
- `docker-compose build`
- `docker-compose up -d django`
  - If you don't specify `django`, this also starts `gephi`, which should only be run as a regular one-off job.
- To run in production, run `caddy` rather than `django`.
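The authoritative list of variables is in the repository's `example.env`. Purely as an illustration (the variable names below are hypothetical, not taken from the repo), a Django-based stack like this one typically needs values along these lines:

```
# Hypothetical illustration — copy the real variables from example.env.
SECRET_KEY=change-me          # Django's cryptographic signing key
POSTGRES_PASSWORD=change-me   # database password shared with the db container
DEBUG=False                   # keep disabled outside local development
```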
### Frontend
- `cd frontend && yarn install`
- `yarn start`

## Commands
### Backend
After running the backend in Docker:
- `docker-compose exec django python manage.py scrape` scrapes the fediverse
  - It only scrapes instances that have not been scraped in the last 24 hours.
  - By default, it only scrapes 50 instances per run. To scrape everything, pass the `--all` flag.
- `docker-compose exec django python manage.py build_edges` aggregates this information into weighted edges
- `docker-compose run gephi java -Xmx1g -jar build/libs/graphBuilder.jar` lays out the graph
To run in production, use `docker-compose -f docker-compose.yml -f docker-compose.production.yml` instead of just `docker-compose`.
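Typing both `-f` flags for every production command gets verbose. One option (an assumption on our part, not something the repository ships) is a shell alias:

```
# hypothetical convenience alias; not part of the repository
alias dc-prod='docker-compose -f docker-compose.yml -f docker-compose.production.yml'

# then, for example:
dc-prod up -d caddy
dc-prod exec -T django python manage.py scrape
```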
An example crontab:
```
# crawl 50 stale instances (plus any newly discovered instances from them)
# the -T flag is important; without it, docker-compose will allocate a tty to the process
15,45 * * * * docker-compose -f docker-compose.yml -f docker-compose.production.yml exec -T django python manage.py scrape
# build the edges based on how much users interact
15 3 * * * docker-compose -f docker-compose.yml -f docker-compose.production.yml exec -T django python manage.py build_edges
# layout the graph
20 3 * * * docker-compose -f docker-compose.yml -f docker-compose.production.yml run gephi java -Xmx1g -jar build/libs/graphBuilder.jar
```
### Frontend
- `yarn build` to create an optimized build for deployment
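`yarn build` writes a static bundle to an output directory (`build/` for a create-react-app project; an assumption here, check your frontend tooling). In production this is presumably served by the `caddy` service mentioned above; for a quick local check, any static file server works, for example:

```
# serve the production bundle locally (assumes the output directory is build/)
npx serve -s build
```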