New install startup fix, #build

Changed:
- Fixed not loading new default configs at expected time
- Better startup error handling
pull/437/head
simon 2 years ago
commit 5d8dc76e7a

@@ -66,10 +66,10 @@ There's dedicated user-contributed install steps under [docs/Installation.md](./
 For minimal system requirements, the Tube Archivist stack needs around 2GB of available memory for a small testing setup and around 4GB of available memory for a mid to large sized installation. A dual core CPU with 4 threads is the minimum; quad core or better is recommended.
-Note for arm64 hosts: The Tube Archivist container is multi arch, so is Elasticsearch. RedisJSON doesn't offer arm builds, but you can use the image `bbilly1/rejson`, an unofficial rebuild for arm64.
 This project requires docker. Ensure it is installed and running on your system.
+Note for **arm64**: Tube Archivist is a multi arch container, same for redis. For Elasticsearch use the official image for arm64 support. Other architectures are not supported.
 Save the [docker-compose.yml](./docker-compose.yml) file from this repository somewhere permanent on your system, keeping it named `docker-compose.yml`. You'll need to refer to it whenever starting this application.
 Edit the following values from that file:
@@ -153,6 +153,8 @@ Wildcards "*" can not be used for the Access-Control-Allow-Origin header. If the
 Use `bbilly1/tubearchivist-es` to automatically get the recommended version, or use the official image with the version tag in the docker-compose file.
+Use official Elasticsearch for **arm64**.
 Stores video meta data and makes everything searchable. Also keeps track of the download queue.
 - Needs to be accessible over the default port `9200`
 - Needs a volume at **/usr/share/elasticsearch/data** to store data

@@ -24,7 +24,7 @@ services:
       - archivist-es
       - archivist-redis
   archivist-redis:
-    image: redislabs/rejson # for arm64 use bbilly1/rejson
+    image: redis/redis-stack-server
    container_name: archivist-redis
    restart: unless-stopped
    expose:

@@ -121,8 +121,8 @@ The field **Refresh older than x days** takes a number where TubeArchivist will
 ## Thumbnail check
 This will check if all expected thumbnails are there and will delete any artwork without matching video.
-## Index backup
+## ZIP file index backup
-Create a zip file of the metadata and select **Max auto backups to keep** to automatically delete old backups created from this task.
+Create a zip file of the metadata and select **Max auto backups to keep** to automatically delete old backups created from this task. For data consistency, make sure there aren't any other tasks running that will change the index during the backup process. This is very slow, particularly for large archives. Use snapshots instead.
 # Actions
@@ -166,8 +166,8 @@ If the video you are trying to import is not available on YouTube any more, **Tu
 ## Embed thumbnails into media file
 This will write or overwrite all thumbnails in the media file using the downloaded thumbnail. This is only necessary if you didn't download the files with the option *Embed Thumbnail* enabled or want to make sure all media files get the newest thumbnail. Follow the docker-compose logs to monitor progress.
-## Backup Database
+## ZIP file index backup
-This will backup your metadata into a zip file. The file will get stored at *cache/backup* and will contain the necessary files to restore the Elasticsearch index, formatted as **nd-json** files.
+This will backup your metadata into a zip file. The file will get stored at *cache/backup* and will contain the necessary files to restore the Elasticsearch index, formatted as **nd-json** files. For data consistency, make sure there aren't any other tasks running that will change the index during the backup process. This is very slow, particularly for large archives.
 BE AWARE: This will **not** backup any media files, just the metadata from Elasticsearch.
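The nd-json export described above can be sketched in a few lines (a simplified illustration only: the archive member name and document shape here are hypothetical placeholders, while the real task dumps every Elasticsearch index into `cache/backup`):

```python
import io
import json
import zipfile


def backup_to_zip(docs: list, index_name: str) -> bytes:
    """Serialize documents as newline-delimited JSON (nd-json)
    and store them in an in-memory zip archive, mirroring the
    kind of file the backup task writes to cache/backup."""
    ndjson = "\n".join(json.dumps(doc) for doc in docs) + "\n"
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
        # member name is a made-up example, not the real naming scheme
        zf.writestr(f"es_{index_name}-backup.json", ndjson)
    return buf.getvalue()


data = backup_to_zip([{"youtube_id": "abc123", "title": "demo"}], "ta_video")
with zipfile.ZipFile(io.BytesIO(data)) as zf:
    print(zf.namelist())  # → ['es_ta_video-backup.json']
```

Restoring is the reverse: read each nd-json member line by line and bulk-index the documents back into Elasticsearch.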

@@ -168,4 +168,4 @@ class Command(BaseCommand):
                 message = f" 🗙 {index_name} vid_type update failed"
                 self.stdout.write(self.style.ERROR(message))
                 self.stdout.write(response)
-                CommandError(message)
+                raise CommandError(message)
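The hunk above fixes a classic bug: the exception was constructed but never raised, so it was silently discarded and the command carried on as if nothing failed. A standalone sketch (with `CommandError` stubbed in place of Django's, and made-up function names):

```python
class CommandError(Exception):
    """Stand-in for django.core.management.base.CommandError."""


def update_without_raise() -> str:
    # the bug: the exception object is created, then thrown away
    CommandError("vid_type update failed")
    return "continued"


def update_with_raise() -> None:
    # the fix: `raise` actually aborts the command
    raise CommandError("vid_type update failed")


print(update_without_raise())  # → continued
try:
    update_with_raise()
except CommandError as err:
    print(err)  # → vid_type update failed
```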

@@ -32,7 +32,9 @@ SECRET_KEY = PW_HASH.hexdigest()
 # SECURITY WARNING: don't run with debug turned on in production!
 DEBUG = bool(environ.get("DJANGO_DEBUG"))
-ALLOWED_HOSTS = [i.strip() for i in environ.get("TA_HOST").split()]
+ALLOWED_HOSTS = []
+if environ.get("TA_HOST"):
+    ALLOWED_HOSTS = [i.strip() for i in environ.get("TA_HOST").split()]
 CSRF_TRUSTED_ORIGINS = []
 for host in ALLOWED_HOSTS:
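The settings change above guards against `TA_HOST` being unset on a fresh install, where the old code would crash on `None.split()` before the app could even start. The same logic as a standalone sketch (the function name is mine, not from the codebase):

```python
import os


def allowed_hosts_from_env(environ=os.environ) -> list:
    """Mirror of the fixed settings logic: only split TA_HOST
    when the variable is actually set and non-empty."""
    hosts = []
    if environ.get("TA_HOST"):
        hosts = [i.strip() for i in environ["TA_HOST"].split()]
    return hosts


print(allowed_hosts_from_env({"TA_HOST": "tube.example.com 192.168.1.10"}))
# → ['tube.example.com', '192.168.1.10']
print(allowed_hosts_from_env({}))  # → []
```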

@@ -266,6 +266,7 @@ class ScheduleBuilder:
     def build_schedule(self):
         """build schedule dict as expected by app.conf.beat_schedule"""
         AppConfig().load_new_defaults()
+        self.config = AppConfig().config
         schedule_dict = {}
         for schedule_item in self.SCHEDULES:
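The added line re-reads the config right after `load_new_defaults()`, so the schedule is built from the freshly merged defaults rather than a copy cached earlier. A rough, hypothetical sketch of the surrounding loop (the `SCHEDULES` entries and beat-entry shape here are simplified guesses, not the real implementation, which builds celery crontab objects):

```python
# hypothetical task names for illustration; "run_backup" appears in the
# settings template above, "update_subscribed" is assumed
SCHEDULES = ["update_subscribed", "run_backup"]


def build_schedule(config: dict) -> dict:
    """Build a dict shaped roughly like app.conf.beat_schedule
    from the freshly loaded config."""
    schedule_dict = {}
    for schedule_item in SCHEDULES:
        item_conf = config["scheduler"].get(schedule_item)
        if not item_conf:
            continue  # task has no schedule configured
        schedule_dict[f"schedule_{schedule_item}"] = {
            "task": schedule_item,
            # the real app converts this dict into a celery crontab
            "schedule": item_conf,
        }
    return schedule_dict


config = {"scheduler": {"run_backup": {"minute": "0", "hour": "8"}}}
print(build_schedule(config))
```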

@@ -286,8 +286,9 @@
         </div>
     </div>
     <div class="settings-group">
-        <h2>Index backup</h2>
+        <h2>ZIP file index backup</h2>
         <div class="settings-item">
+            <p><i>Zip file backups are very slow for large archives and consistency is not guaranteed, use snapshots instead. Make sure no other tasks are running when creating a Zip file backup.</i></p>
             <p>Current index backup schedule: <span class="settings-current">
                 {% if config.scheduler.run_backup %}
                     {% for key, value in config.scheduler.run_backup.items %}
@@ -332,8 +333,9 @@
         </div>
     </div>
     <div class="settings-group">
-        <h2>Backup database</h2>
+        <h2>ZIP file index backup</h2>
         <p>Export your database to a zip file stored at <span class="settings-current">cache/backup</span>.</p>
+        <p><i>Zip file backups are very slow for large archives and consistency is not guaranteed, use snapshots instead. Make sure no other tasks are running when creating a Zip file backup.</i></p>
         <div id="db-backup">
             <button onclick="dbBackup()">Start backup</button>
         </div>
