

MIRRORED FROM: https://git.sp4ke.com/sp4ke/hugobot

HUGOBOT

hugobot is an automated content fetch and aggregation bot for data-driven Hugo websites. It has the following features:

Data fetch

  • Use the feeds table to register feeds that will be periodically fetched, stored, and exported into the Hugo project.
  • Currently handles these feed types: RSS, GitHub releases, and newsletters.
  • Define your own feed types by implementing the JobHandler interface (see handlers/handlers.go).
  • Hugobot automatically fetches new posts from the registered feeds.
  • SQLite is used for storage, with feeds and posts tables.
  • The scheduler can handle any number of tasks and uses LevelDB for caching/resuming jobs.
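A custom feed type boils down to satisfying the JobHandler interface. The sketch below is illustrative only: the real interface is defined in handlers/handlers.go and its exact method set and Job fields may differ from what is assumed here.

```go
package main

import "fmt"

// Job carries the work for one fetch; FeedURL is an assumed field,
// the real struct lives in the hugobot codebase.
type Job struct {
	FeedURL string
}

// JobHandler is a hypothetical rendering of the interface described
// in the README (see handlers/handlers.go for the real one).
type JobHandler interface {
	Handle(job Job) error
}

// ChangelogHandler is an invented example feed type that would fetch
// a plain-text changelog instead of an RSS feed.
type ChangelogHandler struct{}

func (h ChangelogHandler) Handle(job Job) error {
	fmt.Printf("fetching changelog from %s\n", job.FeedURL)
	// fetch, parse, and store posts here
	return nil
}

func main() {
	var h JobHandler = ChangelogHandler{}
	if err := h.Handle(Job{FeedURL: "https://example.com/CHANGELOG"}); err != nil {
		panic(err)
	}
}
```

Registering the handler with the scheduler would then make the new feed type available to rows in the feeds table.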

Hugo export

  • Data is automatically exported to the configured Hugo website path.
  • It can export Markdown files or JSON/TOML data files.
  • All fields in the exported files can be customized.
  • You can define custom output formats using the FormatHandler interface.
  • You can register custom filters and post-processing steps on exported posts, so the raw data stored in the db stays untouched.
  • You can force a data export using the CLI.
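The filter idea above can be sketched as a function that transforms a post at export time while leaving the stored row alone. The Post fields and the PostFilter signature here are assumptions for illustration, not the actual definitions from the types or filters packages.

```go
package main

import (
	"fmt"
	"strings"
)

// Post stands in for an exported post; the real struct in the
// hugobot types package will have more fields.
type Post struct {
	Title   string
	Content string
}

// PostFilter is a hypothetical filter signature: it post-processes a
// post on export without modifying the raw data in the DB.
type PostFilter func(Post) Post

// trimTitle is an example filter that normalizes whitespace in titles.
func trimTitle(p Post) Post {
	p.Title = strings.TrimSpace(p.Title)
	return p
}

// applyFilters runs each registered filter over the post in order.
func applyFilters(p Post, filters ...PostFilter) Post {
	for _, f := range filters {
		p = f(p)
	}
	return p
}

func main() {
	p := applyFilters(Post{Title: "  Hello  "}, trimTitle)
	fmt.Println(p.Title) // Hello
}
```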

API

  • Uses gin-gonic.

  • hugobot also includes a web server API that can be used with Hugo's data-driven mode.

  • Insert and query data from the db. This is still a WIP; you can easily add the missing code on the API side to automate adding/querying data from the DB.

  • An example use case is the automated generation of Bitcoin addresses for new articles on bitcointechweekly.com.

Other

  • Some commands are available through the CLI (github.com/urfave/cli), you can add your own custom commands.

Sqliteweb interface

  • See the Docker files.

First time usage

  • The database is automatically created the first time you run the program. You can add your feeds directly to the SQLite db using your favorite SQLite GUI or the web GUI provided in the docker-compose file.

Contribution

  • PRs welcome, current priority is to add tests.
  • Check the TODO section.

TODO:

  • Add tests.
  • Handle more feed formats: tweets, mailing-list emails ...
  • TLS support in the API (not a priority, can be done with a reverse proxy).