There are a few conventional choices, but this one should be friendly and
generally accepted by local readers.
The previous version is still comprehensible but less commonly used (it is
perhaps more typical of Japanese documents) and may give local users pause.
Regular commits are all built and published to TestPyPI; tagged commits
are published to PyPI.
This should finish the process of moving away from Travis CI, now that
both testing and PyPI deployments are handled in GitHub Actions.
Restricting form-action to 'self' in the content security policy
prevented Chrome (and likely other browsers) from using !bangs on the
home page.
Fixes #408
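Roughly, the fix amounts to no longer pinning form-action to 'self' when the header is built. A minimal sketch (the Flask after_request hook and the directive list here are assumptions, not the project's actual CSP code):

```python
from flask import Flask

app = Flask(__name__)

@app.after_request
def apply_csp(response):
    # Hypothetical directive list: form-action is intentionally left out so
    # that submitting a !bang query (which redirects off-site) isn't blocked
    response.headers['Content-Security-Policy'] = (
        "default-src 'self'; "
        "frame-src 'self';"
    )
    return response
```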
The Travis CI folks are changing things and broke my tests, so I'm moving
over to GitHub Actions instead, since that is (hopefully) less likely to
change moving forward.
Will need to move PyPI deployment to GitHub Actions as well.
Since Docker Hub no longer allows automated builds for free tier users,
the build process for new images needs to be moved to GitHub Actions.
The existing buildx workflow has worked pretty well for the most part,
but was only enabled for the develop branch and only pushed the
buildx-experimental tag. This addition allows pushes to the main branch
to build updates for the "latest" tag as well, which I think is more
commonly used.
* Make replit install all requirements first
This should install all requirements from requirements.txt. It makes this a one-click experience, without the user having to run `pip install -r requirements.txt` and then tap the run button. I had to run that command manually in my repl first, so I have made this change so others don't have to do the same.
repl.it also runs on Linux-based systems, so `&&` is the correct bash syntax.
* Running in Bash
I applied the same change I made to onBoot to the run variable, and set the language to bash, since the `./` and `&&` syntax belongs to bash.
Previously if a result element marked for collapsing didn't have a valid
"parent" element, the collapsing was skipped altogether. This loops
through child elements until a valid parent is found (or if one isn't
found, the element will not be collapsed).
On app init, short hashes are generated from file checksums to use for
cache busting. These hashes are added into the full file name and used
to symlink to the actual file contents. These symlinks are loaded in the
jinja templates for each page, and can tell the browser to load a new
file if the hash changes.
This is only in place for css and js files, but can be extended in the
future for other file types if needed.
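A minimal sketch of the approach (the gen_file_hash name, the md5/8-character choice, and the static paths are illustrative assumptions, not the exact implementation):

```python
import hashlib
import os

def gen_file_hash(directory: str, filename: str) -> str:
    # Short hash of the file contents, inserted into the file name
    with open(os.path.join(directory, filename), 'rb') as f:
        file_hash = hashlib.md5(f.read()).hexdigest()[:8]
    name, ext = os.path.splitext(filename)
    return f'{name}.{file_hash}{ext}'

# On app init: symlink e.g. "main.abc12345.css" -> "main.css" so the Jinja
# templates can reference the hashed name and force a reload when it changes
static_dir = 'app/static/css'
for f in os.listdir(static_dir):
    if f.count('.') == 1 and f.endswith('.css'):
        hashed = gen_file_hash(static_dir, f)
        link_path = os.path.join(static_dir, hashed)
        if not os.path.islink(link_path):
            os.symlink(f, link_path)
```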
Introduces a new config element and environment variable
(WHOOGLE_CONFIG_THEME) for setting the theme of the app. Rather than
just having either light or dark, this allows a user to have their
instance use their current system light/dark preference to determine the
theme to use.
As a result, the dark mode setting (and WHOOGLE_CONFIG_DARK) has been
deprecated, but will still work as expected until a system theme has
been chosen.
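As a hedged sketch of the resolution order implied above (the function name and accepted values are assumptions):

```python
import os

def resolve_theme() -> str:
    theme = os.getenv('WHOOGLE_CONFIG_THEME', '')
    if theme in ('light', 'dark', 'system'):
        return theme
    # Deprecated fallback: WHOOGLE_CONFIG_DARK is only honored when no
    # theme has been chosen explicitly
    return 'dark' if os.getenv('WHOOGLE_CONFIG_DARK', '') not in ('', '0') else 'light'
```

The "system" value would then presumably be handled client-side via the prefers-color-scheme media query in the stylesheet.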
Sections such as "People also asked" and "related searches" typically
take up a lot of room on the results page, and don't always have the
most useful information. This checks for result elements with more than
7 child divs, extracts the section title, and wraps all elements in a
"details" element that can be expanded/collapsed by the user.
Note that this functionality existed previously (albeit not implemented
as well), but due to changes in how Google returns searches (switching
from using <h2> elements for section headers to <span> or <div>
elements), the approach to collapsing these sections needed to be
updated.
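A minimal BeautifulSoup sketch of that wrapping step (the selectors, title handling, and function name are illustrative; only the 7-child-div threshold comes from the description above):

```python
from bs4 import BeautifulSoup

def collapse_sections(soup: BeautifulSoup) -> None:
    for section in soup.find_all('div'):
        # Only sections with many direct child divs are worth collapsing
        if len(section.find_all('div', recursive=False)) <= 7:
            continue

        details = soup.new_tag('details')
        summary = soup.new_tag('summary')

        # Grab the first text node as the section title, since Google now
        # emits <span>/<div> headers rather than <h2>
        title = section.find(text=True)
        summary.string = title.extract().strip() if title else 'Section'

        details.append(summary)
        section.wrap(details)
```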
* Add support for Lingva translations in results
Searches that contain the word "translate" and are normal search queries
(i.e. not news/images/video/etc) now create an iframe to a Lingva URL to
translate the user's search using their configured search language (see
the sketch after this list).
The Lingva URL can be configured using the WHOOGLE_ALT_TL env var, or
will fall back to the official Lingva instance URL (lingva.ml).
For more info, visit https://github.com/TheDavidDelta/lingva-translate
* Add basic test for lingva results
* Allow user specified lingva instances through csp frame-src
* Fix pep8 issue
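The sketch referenced above: a rough illustration of how the iframe URL could be put together (the /<source>/<target>/<query> path layout and all names here are assumptions, not the exact implementation):

```python
import os
from urllib.parse import quote

def lingva_iframe(query: str, search_lang: str) -> str:
    # Configured instance, falling back to the official one
    base = os.getenv('WHOOGLE_ALT_TL', 'https://lingva.ml')
    # Drop the "translate" keyword and let Lingva auto-detect the source
    text = query.replace('translate', '', 1).strip()
    return f'<iframe src="{base}/auto/{search_lang}/{quote(text)}"></iframe>'
```

Whatever instance ends up in WHOOGLE_ALT_TL also has to be allowed in the CSP frame-src directive, which is what the third bullet above covers.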
A recent issue brought up a good point about how the latest change
(setting the default language to English) breaks functionality for
bilingual users. The change was likely not the best solution for users who were
being affected by IP geolocation on their instances -- the right
solution for that would be to configure the interface/search language to
their preference instead.
The requests library requires both 'http' and 'https' values in any
included proxy dict, and whoogle was previously copying the http proxy
to https for simplicity. The assumption was that if the underlying
request wasn't able to connect via https, it would default to http
(otherwise why have the requirement to specify both?)
This led to connectivity issues for users with http only proxies as of
the latest urllib and requests package versions, which are a lot more
strict with connections over https. With the latest versions, if an
https connection cannot be made, the library returns an error.
As a result, the new proxy dict must look something like this for plain
http proxies:
{'http': 'http://domain.tld:port', 'https': 'http://domain.tld:port'}
where both http and https are identical, but both are still required.
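A minimal sketch of what that looks like when handing the dict to requests (host and port are placeholders):

```python
import requests

# The same plain-http proxy URL is mirrored into both keys; requests insists
# on both entries even though the proxy itself only speaks http
proxy_url = 'http://proxy.example.com:8080'
proxies = {'http': proxy_url, 'https': proxy_url}

resp = requests.get('https://example.com', proxies=proxies, timeout=10)
```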
Since the interface language defaults to IP geolocation by Google, the
default language is now set to English. Still not sure if this is the
best solution, but it should at least temporarily clear up some confusion
for users with instances deployed in countries outside of their own.
Also performed some minor cleanup:
- Updated name of strip_blocked_sites to clean_query
- Added clean_query to list of jinja template functions
- Ensured site block list doesn't contain duplicate filters
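As a hedged sketch of what clean_query might do (assuming the block list is applied by appending "-site:<domain>" filters to the outgoing query; the regex is illustrative):

```python
import re

def clean_query(query: str) -> str:
    # Strip appended "-site:<domain>" filters before echoing the query back
    # into the search box / page title
    return re.sub(r'\s*-site:\S+', '', query).strip()

# clean_query('python docs -site:example.com -site:spam.tld') -> 'python docs'
```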