fediverse.space 🌐

The map of the fediverse that you always wanted.

A screenshot of fediverse.space

Requirements

  • For everything:
    • Docker
    • Docker Compose
  • For the scraper + API:
    • Python 3
  • For laying out the graph:
    • Java
  • For the frontend:
    • Yarn
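
To sanity-check that the tooling is installed, you can run the commands below (a quick sketch; exact version requirements aren't pinned in this README):

# check that the required tools are available on your PATH
docker --version
docker-compose --version
python3 --version
java -version
yarn --version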

Running it

Backend

  • cp example.env .env and modify environment variables as required
  • docker-compose build
  • docker-compose up -d django
    • if you don't specify django, it will also start gephi, which should only be run as a regularly scheduled one-off job (see the crontab below), not as a long-running service
    • to run in production, run caddy rather than django
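
Putting those backend steps together, a first run looks roughly like this (same service names as above; caddy replaces django in production):

# copy and edit the environment file
cp example.env .env
# build the images and start the API in the background
docker-compose build
docker-compose up -d django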

Frontend

  • cd frontend && yarn install
  • yarn start
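
In full, from the repository root:

# install dependencies and start the development server
cd frontend
yarn install
yarn start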

Commands

Backend

After running the backend in Docker:

  • docker-compose exec django python manage.py scrape scrapes the fediverse
    • It only scrapes instances that have not been scraped in the last 24 hours.
    • By default, it'll only scrape 50 instances in one go. If you want to scrape everything, pass the --all flag.
  • docker-compose exec django python manage.py build_edges aggregates this information into edges with weights
  • docker-compose run gephi java -Xmx1g -jar build/libs/graphBuilder.jar lays out the graph
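
Run together, a full refresh of the data looks something like this; the --all flag is optional and scrapes every known instance rather than the default 50:

# scrape all instances instead of just 50 stale ones
docker-compose exec django python manage.py scrape --all
# aggregate the scraped interactions into weighted edges
docker-compose exec django python manage.py build_edges
# lay out the graph as a one-off job
docker-compose run gephi java -Xmx1g -jar build/libs/graphBuilder.jar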

To run in production, use docker-compose -f docker-compose.yml -f docker-compose.production.yml instead of just docker-compose.
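
For example, to bring the production stack up (assuming caddy is the production entrypoint, as noted above):

docker-compose -f docker-compose.yml -f docker-compose.production.yml up -d caddy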

An example crontab:

# crawl 50 stale instances (plus any newly discovered instances from them)
# the -T flag is important; without it, docker-compose will allocate a tty to the process
15,45 * * * * docker-compose -f docker-compose.yml -f docker-compose.production.yml exec -T django python manage.py scrape
# build the edges based on how much users interact
15 3 * * * docker-compose -f docker-compose.yml -f docker-compose.production.yml exec -T django python manage.py build_edges
# lay out the graph
20 3 * * * docker-compose -f docker-compose.yml -f docker-compose.production.yml run gephi java -Xmx1g -jar build/libs/graphBuilder.jar

Frontend

  • yarn build to create an optimized build for deployment
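
From the repository root, that is:

# create an optimized production build
# (the output directory depends on the frontend toolchain)
cd frontend
yarn build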