After an uptime of 109 days, /var/lib/tor was still only using 10.9 MB.
A reply in issue #648 reported higher usage, which was fixed by setting
the size limit a bit higher (12MB instead of 11MB).
Wikipedia, imgur, and translate alternatives were all still using
hardcoded URLs when results were replaced with their respective
alternative frontends. This updates them to use Farside instead.
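A rough sketch of the idea (the service mapping and helper below are
illustrative assumptions, not the actual Whoogle code):

```python
# Illustrative sketch only -- service names and URL handling are assumptions.
FARSIDE_BASE = 'https://farside.link'

# Hypothetical mapping from original host to Farside service name
FARSIDE_SERVICES = {
    'wikipedia.org': 'wikiless',
    'imgur.com': 'rimgo',
    'translate.google.com': 'lingva',
}

def to_farside(url: str) -> str:
    """Rewrite a result URL to go through Farside instead of a hardcoded instance."""
    for host, service in FARSIDE_SERVICES.items():
        if host in url:
            # e.g. https://imgur.com/gallery/abc -> https://farside.link/rimgo/gallery/abc
            path = url.split(host, 1)[-1]
            return f'{FARSIDE_BASE}/{service}{path}'
    return url
```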
Also adds the ability to override the image in docker-compose.yml,
which allows the CI build to use the same image for all docker tests.
The default is still 'benbusby/whoogle-search' though.
* Integrate Farside into Whoogle
When an instance is ratelimited (i.e. a captcha is returned instead of
the user's search results), the user can now hop to a new instance via
Farside, a new backend service that redirects users to working instances
of a particular frontend. In this case, Whoogle presents the user with a
Farside link to a new Whoogle (or Searx) instance instead, so that the
user can resume their search.
The generated Farside->Whoogle link includes the user's current Whoogle
configuration settings as URL params, to ensure a more seamless
transition between instances. This doesn't translate to the
Farside->Searx link, but potentially could with some changes.
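A minimal sketch of how such a link could be assembled (function and
parameter names are assumptions; only the farside.link base and the
param-forwarding behavior come from the description above):

```python
# Sketch only -- not the actual Whoogle implementation.
from urllib.parse import urlencode

FARSIDE_BASE = 'https://farside.link'

def farside_hop_link(service: str, query: str, config_params: dict) -> str:
    """Build a Farside link to a working instance, optionally carrying settings.

    For Whoogle, the user's current config is appended as URL params so the
    new instance behaves the same; for Searx, only the query is forwarded.
    """
    params = {'q': query}
    if service == 'whoogle':
        params.update(config_params)
    return f'{FARSIDE_BASE}/{service}/search?{urlencode(params)}'

# Example (shown to the user when a captcha/ratelimit is detected):
# farside_hop_link('whoogle', 'privacy tools', {'theme': 'dark', 'lang_search': 'lang_en'})
```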
* Expand conversion of config<->url params
Config settings can now be translated to and from URL params using a
predetermined set of "safe" keys (i.e. config settings that easily
translate to URL params).
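Conceptually, the conversion looks something like the sketch below (the
key names are placeholders; the actual "safe" set lives in Whoogle's
config code):

```python
# Sketch under assumptions -- key names are illustrative, not Whoogle's exact list.
SAFE_KEYS = ['theme', 'lang_search', 'lang_interface', 'country', 'new_tab']

def config_to_params(config: dict) -> dict:
    """Keep only config settings that translate cleanly to URL params."""
    return {k: v for k, v in config.items() if k in SAFE_KEYS and v}

def params_to_config(params: dict) -> dict:
    """Reverse direction: pull recognized settings back out of URL params."""
    return {k: v for k, v in params.items() if k in SAFE_KEYS}
```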
* Allow jumping instances via Farside when ratelimited
Closes #554, closes #559
scribe.rip is a privacy-respecting frontend for medium.com. This
feature allows medium.com results to be replaced with scribe.rip links,
and works for both regular medium.com domains and user-specific
subdomains (e.g. user.medium.com).
[scribe.rip website](https://scribe.rip)
[scribe.rip source code](https://git.sr.ht/~edwardloveall/scribe)
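A crude sketch of the substitution (the regex and naming are
assumptions; the real handling of user subdomain paths may differ):

```python
# Sketch only -- not the exact Whoogle filter code.
import re

# Matches medium.com as well as user-specific subdomains like user.medium.com
MEDIUM_HOST = re.compile(r'(?:[A-Za-z0-9-]+\.)?medium\.com')

def to_scribe(url: str) -> str:
    """Replace a medium.com result URL with a scribe.rip equivalent."""
    return MEDIUM_HOST.sub('scribe.rip', url)

# to_scribe('https://medium.com/@user/some-post') -> 'https://scribe.rip/@user/some-post'
```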
Co-authored-by: Ben Busby <noreply+git@benbusby.com>
* Add support for Lingva translations in results
Searches that contain the word "translate" and are normal search queries
(i.e. not news/images/videos/etc) now create an iframe to a Lingva URL
to translate the user's search using their configured search language.
The Lingva URL can be configured using the WHOOGLE_ALT_TL env var, or
will fall back to the official Lingva instance URL (lingva.ml).
For more info, visit https://github.com/TheDavidDelta/lingva-translate
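As a rough illustration (the helper name and view wiring are
hypothetical; the WHOOGLE_ALT_TL fallback is described above, and the
/<source>/<target>/<text> URL pattern is assumed from Lingva's docs):

```python
# Sketch only -- the helper name and view wiring are assumptions.
import os
from urllib.parse import quote

def lingva_iframe_src(query: str, search_lang: str) -> str:
    """Build the iframe src used to translate the user's query via Lingva."""
    # User-specified instance, falling back to the official one
    lingva_url = os.getenv('WHOOGLE_ALT_TL', 'https://lingva.ml')
    # Assumes Lingva's /<source>/<target>/<text> URL pattern
    return f'{lingva_url}/auto/{search_lang}/{quote(query)}'

# lingva_iframe_src('hello world', 'es') -> 'https://lingva.ml/auto/es/hello%20world'
```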
* Add basic test for lingva results
* Allow user specified lingva instances through csp frame-src
* Fix pep8 issue
The requests library requires both 'http' and 'https' values in any
included proxy dict, and Whoogle was previously copying the http proxy
to https for simplicity. The assumption was that if the underlying
request wasn't able to connect via https, it would fall back to http
(otherwise why have the requirement to specify both?).
This led to connectivity issues for users with http-only proxies as of
the latest urllib and requests package versions, which are much
stricter with connections over https. With these versions, if an https
connection cannot be made, the library returns an error.
As a result, the new proxy dict must look something like this for plain
http proxies:
{'http': 'http://domain.tld:port', 'https': 'http://domain.tld:port'}
where both http and https are identical, but both are still required.
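In code, the fix amounts to something like this (the proxy address is
the placeholder from above):

```python
# Both entries point at the same plain-http proxy; requests requires both keys,
# and newer urllib/requests versions no longer fall back silently.
import requests

proxy = 'http://domain.tld:port'  # placeholder for the user's http-only proxy
proxies = {
    'http': proxy,
    'https': proxy,
}

# response = requests.get('https://example.com', proxies=proxies)
```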
The env_file option is a better way of loading environment variables
from whoogle.env, and has been added to docker-compose.yml.
The whoogle.env file had comments after each example setting, and it was
not clear that these comments needed to be removed when setting new
values. The comments for each variable have been moved above the
appropriate variable to reduce confusion.
See #303 for initial discussion
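For example, an entry in whoogle.env now looks something like this
(variable and value taken from the site alt examples below), with the
comment on its own line instead of trailing the value:
# twitter alt
WHOOGLE_ALT_TW='nitter.net'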
This allows the user to enable their preferred settings in a variety of
ways, depending on their deployment preference. Values added to
whoogle.env can be enabled using WHOOGLE_DOTENV=1, in which case all
values in the env file will overwrite defaults or user-provided
settings.
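A conceptual sketch of that behavior (not the exact startup code;
assumes python-dotenv):

```python
# Sketch only -- assumes python-dotenv; the actual wiring lives in Whoogle's startup code.
import os
from dotenv import load_dotenv

# With WHOOGLE_DOTENV=1, values from whoogle.env override defaults and
# user-provided settings; otherwise the file is ignored.
if os.getenv('WHOOGLE_DOTENV', ''):
    load_dotenv('whoogle.env', override=True)
```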
Co-authored-by: Ben Busby <benbusby@protonmail.com>
* Adds the ability to redirect reddit.com to libredd.it using the existing
"site alts" config setting.
This adds the WHOOGLE_ALT_RD environment variable for optionally
redirecting reddit links to libreddit
(https://github.com/spikecodes/libreddit).
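For example, following the same format as the other site alt variables
below:
WHOOGLE_ALT_RD='libredd.it' # reddit alt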
* Include libreddit in home page site alt note
* Add ability to configure site alts w/ env vars
Site alternatives (i.e. twitter.com -> nitter.net) can now be configured
using environment variables:
WHOOGLE_ALT_TW='nitter.net' # twitter alt
WHOOGLE_ALT_YT='invidio.us' # youtube alt
WHOOGLE_ALT_IG='bibliogram.art/u' # instagram alt
Updated testing to confirm results have been modified.
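A simplified sketch of how these are applied (the replacement function
is illustrative, not the actual Whoogle filter code):

```python
# Illustrative sketch -- not the actual Whoogle filter code.
import os

SITE_ALTS = {
    'twitter.com': os.getenv('WHOOGLE_ALT_TW'),    # e.g. 'nitter.net'
    'youtube.com': os.getenv('WHOOGLE_ALT_YT'),    # e.g. 'invidio.us'
    'instagram.com': os.getenv('WHOOGLE_ALT_IG'),  # e.g. 'bibliogram.art/u'
    'reddit.com': os.getenv('WHOOGLE_ALT_RD'),     # e.g. 'libredd.it'
}

def replace_site_alts(html: str) -> str:
    """Swap original hosts for their configured alternatives in result markup."""
    for site, alt in SITE_ALTS.items():
        if alt:  # unset vars leave results untouched
            html = html.replace(site, alt)
    return html
```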
* Add site alt vars to docker settings and readme
Environment variables should be disabled by default, since they are
optional and need further configuration by the user before being
enabled.
The readme was updated to reflect this approach, to move the
documentation for the variables a bit lower, and to properly link to
them from other areas of the readme.
* Ignore venv when building docker file
* Remove reference to 8888 port
It wasn't really used anywhere, and using 5000 everywhere removes ambiguity and makes things easier to track and reason about.
* Use waitress rather than Flask's built-in web server
Flask's built-in server is not production grade.
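The change boils down to something like the following (the import path
for the app is an assumption):

```python
# Serve the Flask app with waitress instead of app.run(); port 5000 per the commit above.
from waitress import serve
from app import app  # hypothetical import path for Whoogle's Flask app

serve(app, host='0.0.0.0', port=5000)
```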
* Actually add waitress to requirements
Woops!