MIRRORED FROM: https://git.blob42.xyz/sp4ke/hugobot

HUGOBOT

hugobot is a bot that automates the fetching and aggregation of content for Hugo data-driven websites. It has the following features:

Data fetch

  • Use the feeds table to register feeds that will be fetched periodically.
  • Currently, it can handle these types of feeds: RSS, GitHub Releases, and newsletters.
  • To define your own feed types, implement the JobHandler interface (see handlers/handlers.go and the sketch after this list).
  • Hugobot automatically fetches new posts from the registered feeds.
  • Storage uses an SQLite database with feeds and posts tables.
  • The scheduler can handle an arbitrary number of tasks and uses LevelDB for caching and resuming jobs.
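
For a sense of what a custom feed type looks like, here is a minimal sketch of a handler. The JobHandler interface and the Feed/Post types shown here are assumptions made for illustration; the real definitions live in handlers/handlers.go and the types package and may differ.

```go
package main

import (
	"fmt"
	"time"
)

// Feed and Post are stand-ins for hugobot's own types package; the real
// definitions may differ.
type Feed struct {
	Name string
	URL  string
}

type Post struct {
	Title     string
	Link      string
	Published time.Time
}

// Assumed shape of the JobHandler interface from handlers/handlers.go:
// one method that fetches the new posts for a feed.
type JobHandler interface {
	Handle(feed Feed) ([]Post, error)
}

// ChangelogHandler is a made-up handler that would scrape a project's
// changelog page and turn new entries into posts.
type ChangelogHandler struct{}

func (ChangelogHandler) Handle(feed Feed) ([]Post, error) {
	// A real handler would fetch and parse feed.URL here.
	fmt.Printf("fetching %s (%s)\n", feed.Name, feed.URL)
	return []Post{{Title: "example entry", Link: feed.URL, Published: time.Now()}}, nil
}

func main() {
	var h JobHandler = ChangelogHandler{}
	posts, err := h.Handle(Feed{Name: "example", URL: "https://example.com/changelog"})
	if err != nil {
		panic(err)
	}
	fmt.Println(len(posts), "new post(s)")
}
```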

Hugo export

  • Data is automatically exported to the configured Hugo website path.
  • It can export data as Markdown files or as JSON/TOML data files.
  • You can customize all fields in the exported files.
  • You can define custom output formats by implementing the FormatHandler interface (see the sketch after this list).
  • You can register custom filters and post-processing steps for exported posts; they run at export time, so the raw data stored in the database is never altered.
  • You can force data export using the CLI.
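
As an illustration of a custom output format combined with an export filter, here is a minimal sketch. The FormatHandler signature, the Filter function shape, and the Post type are assumptions; the real code lives in the export and filters packages.

```go
package main

import (
	"fmt"
	"strings"
)

// Assumed post shape, for illustration only.
type Post struct {
	Title   string
	Link    string
	Content string
}

// Assumed interface: turn a post into the bytes written to the Hugo site.
type FormatHandler interface {
	Format(p Post) ([]byte, error)
}

// TOMLFrontMatter is a made-up formatter that emits a Markdown file with
// TOML front matter.
type TOMLFrontMatter struct{}

func (TOMLFrontMatter) Format(p Post) ([]byte, error) {
	var b strings.Builder
	fmt.Fprintf(&b, "+++\ntitle = %q\nlink = %q\n+++\n\n%s\n", p.Title, p.Link, p.Content)
	return []byte(b.String()), nil
}

// Assumed filter shape: transforms a copy of the post before export,
// leaving the row stored in the database untouched.
type Filter func(p Post) Post

func stripTrackingParams(p Post) Post {
	p.Link = strings.Split(p.Link, "?utm_")[0]
	return p
}

func main() {
	p := Post{Title: "Hello", Link: "https://example.com/a?utm_source=x", Content: "body"}
	out, _ := TOMLFrontMatter{}.Format(stripTrackingParams(p))
	fmt.Print(string(out))
}
```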

API

  • It uses gin-gonic as the web framework.
  • hugobot also includes a web server API that can be used with Hugo's data-driven content mode.
  • You can insert data into and query data from the database. This feature is still a work in progress; the missing endpoints on the API side are easy to add (a minimal route sketch follows this list).
  • For example, it can be used to automate the generation of Bitcoin addresses for new articles on bitcointechweekly.com.
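
The following sketch shows what a gin-gonic route in this spirit could look like. The route path, query parameter, and Post shape are assumptions made for illustration, not hugobot's actual API; the real routes live in api.go and server.go.

```go
package main

import (
	"net/http"

	"github.com/gin-gonic/gin"
)

// Assumed post shape returned to Hugo's data-driven templates.
type Post struct {
	Title string `json:"title"`
	Link  string `json:"link"`
}

func main() {
	r := gin.Default()

	// GET /api/posts?feed=<name> returns the posts of one feed as JSON,
	// which a Hugo template could consume with getJSON.
	r.GET("/api/posts", func(c *gin.Context) {
		feed := c.Query("feed")
		// The real project would query the SQLite posts table here.
		posts := []Post{{Title: "example post for " + feed, Link: "https://example.com"}}
		c.JSON(http.StatusOK, posts)
	})

	r.Run(":8080") // the port is an arbitrary choice for this sketch
}
```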

Other

  • Some commands are available through the CLI (built with github.com/urfave/cli); you can add your own custom commands (see the sketch below).
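
Here is a minimal sketch of adding a custom command with urfave/cli (v1). The command name and behavior are made up for illustration; hugobot's real commands are defined in commands.go and feed_commands.go.

```go
package main

import (
	"fmt"
	"log"
	"os"

	"github.com/urfave/cli"
)

func main() {
	app := cli.NewApp()
	app.Name = "hugobot"

	app.Commands = []cli.Command{
		{
			Name:  "list-feeds",
			Usage: "print the registered feeds (made-up example command)",
			Action: func(c *cli.Context) error {
				// A real command would read the feeds table here.
				fmt.Println("feed listing would go here")
				return nil
			},
		},
	}

	if err := app.Run(os.Args); err != nil {
		log.Fatal(err)
	}
}
```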

Sqliteweb interface

  • See the Docker files for more information.

First time usage

  • The first time you run the program, it will automatically generate the database. You can then add your feeds to the SQLite database using your preferred SQLite GUI, or programmatically as in the sketch below.
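
If you prefer to script the registration, here is a minimal Go sketch that inserts a feed row. The database path and the column names (name, url, category) are guesses made for illustration; check the schema hugobot actually generates on first run and adjust the statement accordingly.

```go
package main

import (
	"database/sql"
	"log"

	_ "github.com/mattn/go-sqlite3"
)

func main() {
	db, err := sql.Open("sqlite3", "./hugobot.sqlite") // path is an assumption
	if err != nil {
		log.Fatal(err)
	}
	defer db.Close()

	// Column names are hypothetical; verify them against the generated schema.
	_, err = db.Exec(
		`INSERT INTO feeds (name, url, category) VALUES (?, ?, ?)`,
		"Example Feed", "https://example.com/index.xml", "rss",
	)
	if err != nil {
		log.Fatal(err)
	}
}
```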

Contribution

  • We welcome pull requests. Our current priority is adding tests.
  • Check the TODO section.

TODO:

  • Add tests.
  • Handle more feed formats: tweets, mailing-list emails ...
  • TLS support in the API (not a priority, can be done with a reverse proxy).