mirror of https://github.com/trailofbits/algo
Refactor to support Ansible 2.8 (#1549)
* bump ansible to 2.8.3
* DigitalOcean: move to the latest modules
* Add Hetzner Cloud
* Scaleway and Lightsail fixes
* lint missing roles
* Update roles/cloud-hetzner/tasks/main.yml: Add api_token (Co-Authored-By: phaer <phaer@phaer.org>)
* Update roles/cloud-hetzner/tasks/main.yml: Add api_token (Co-Authored-By: phaer <phaer@phaer.org>)
* Try to run apt until succeeded
* Scaleway modules upgrade
* GCP: Refactoring, remove deprecated modules
* Doc updates (#1552)
* Update README.md: adding links and mentions of Exoscale aka CloudStack and Hetzner Cloud
* Update index.md: add Hetzner Cloud to the docs index
* Remove link to Win 10 IPsec instructions
* Delete client-windows.md: unnecessary since the deprecation of IPsec for Win10
* Update deploy-from-ansible.md: added sections and required variables for CloudStack and Hetzner Cloud
* Update deploy-from-ansible.md: added sections for CloudStack and Hetzner, added required variables and examples, mentioned environment variables, and added links to the provider role section
* Update deploy-from-ansible.md: cosmetic changes to links, fix typo
* Update GCE variables
* Update deploy-from-script-or-cloud-init-to-localhost.md: fix a finer point, and make the variables list more readable
* update azure requirements
* Python3 draft
* set LANG=c to the p12 password generation task
* Update README
* Install cloud requirements to the existing venv
* FreeBSD fix
* env->.env fixes
* lightsail_region_facts fix
* yaml syntax fix
* Update README for Python 3 (#1564)
* Update README for Python 3
* Remove tabs and tweak instructions
* Remove cosmetic command indentation
* Update README.md
* Update README for Python 3 (#1565)
* DO fix for "found unpermitted parameters: id"
* Verify Python version
* Remove ubuntu 16.04 from readme
* Revert back DigitalOcean module
* Update deploy-from-script-or-cloud-init-to-localhost.md
* env to .env
parent 61729ac9b5
commit 8bdd99c05d
@ -0,0 +1,37 @@
# Linux strongSwan IPsec Clients (e.g., OpenWRT, Ubuntu Server, etc.)

Install strongSwan, then copy the included ipsec_user.conf, ipsec_user.secrets, user.crt (user certificate), and user.key (private key) files to your client device. These will require customization based on your exact use case. These files were originally generated with a point-to-point OpenWRT-based VPN in mind.

## Ubuntu Server example

1. `sudo apt-get install strongswan libstrongswan-standard-plugins`: install strongSwan
2. `/etc/ipsec.d/certs`: copy `<name>.crt` from `algo-master/configs/<server_ip>/ipsec/manual/<name>.crt`
3. `/etc/ipsec.d/private`: copy `<name>.key` from `algo-master/configs/<server_ip>/ipsec/manual/<name>.key`
4. `/etc/ipsec.d/cacerts`: copy `cacert.pem` from `algo-master/configs/<server_ip>/ipsec/manual/cacert.pem`
5. `/etc/ipsec.secrets`: add your `<name>.key` to the list, e.g. `<server_ip> : ECDSA <name>.key`
6. `/etc/ipsec.conf`: add the connection from `ipsec_user.conf` and ensure `leftcert` matches the `<name>.crt` filename
7. `sudo ipsec restart`: pick up the config changes
8. `sudo ipsec up <conn-name>`: start the IPsec tunnel
9. `sudo ipsec down <conn-name>`: shut down the IPsec tunnel
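The file destinations in steps 2-5 can be summarized in a small sketch (illustrative only, not part of Algo; `name` and `server_ip` stand in for your own client name and server address):

```python
# Illustrative sketch of the file layout from steps 2-5 above.
# "name" and "server_ip" are placeholders for your own values.
def strongswan_files(name, server_ip):
    src = "algo-master/configs/%s/ipsec/manual" % server_ip
    return {
        "%s/%s.crt" % (src, name): "/etc/ipsec.d/certs/",
        "%s/%s.key" % (src, name): "/etc/ipsec.d/private/",
        "%s/cacert.pem" % src: "/etc/ipsec.d/cacerts/",
    }

def secrets_line(name, server_ip):
    # The entry format expected in /etc/ipsec.secrets (step 5).
    return "%s : ECDSA %s.key" % (server_ip, name)
```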

One common use case is to let your server access your local LAN without going through the VPN. Set up a passthrough connection by adding the following to `/etc/ipsec.conf`:

    conn lan-passthrough
        leftsubnet=192.168.1.1/24 # Replace with your LAN subnet
        rightsubnet=192.168.1.1/24 # Replace with your LAN subnet
        authby=never # No authentication necessary
        type=pass # passthrough
        auto=route # no need to ipsec up lan-passthrough

To configure the connection to come up at boot time, replace `auto=add` with `auto=start`.

## Notes on SELinux

If you use a system with SELinux enabled, you might need to set appropriate file contexts:

````
semanage fcontext -a -t ipsec_key_file_t "$(pwd)(/.*)?"
restorecon -R -v $(pwd)
````

See [this comment](https://github.com/trailofbits/algo/issues/263#issuecomment-328053950).
@ -1,6 +0,0 @@
# Windows client setup

## Installation via profiles

1. Install the [WireGuard VPN Client](https://www.wireguard.com/install/#windows-7-8-81-10-2012-2016-2019) and start it.
2. Import the corresponding `wireguard/<name>.conf` file to your device, then set up a new connection with it.
@ -1,115 +0,0 @@
# Deploy from Fedora Workstation

These docs were written based on experience on Fedora Workstation 30.

## Prerequisites

### DNF counterparts of apt packages

The following table lists `apt` packages with their `dnf` counterparts. This is purely informative.
Using `python2-*` in favour of `python3-*` as per the [declared dependency](https://github.com/trailofbits/algo#deploy-the-algo-server).

| `apt` | `dnf` |
| ----- | ----- |
| `build-essential` | `make automake gcc gcc-c++ kernel-devel` |
| `libssl-dev` | `openssl-devel` |
| `libffi-dev` | `libffi-devel` |
| `python-dev` | `python2-devel` |
| `python-pip` | `python2-pip` |
| `python-setuptools` | `python2-setuptools` |
| `python-virtualenv` | `python2-virtualenv` |

### Install requirements

First, let's make sure our system is up-to-date:

````
dnf upgrade
````

Next, install the required packages:

````
dnf install -y \
  ansible \
  automake \
  gcc \
  gcc-c++ \
  kernel-devel \
  openssl-devel \
  libffi-devel \
  libselinux-python \
  python2-devel \
  python2-pip \
  python2-setuptools \
  python2-virtualenv \
  python2-crypto \
  python2-pyyaml \
  python2-pyOpenSSL \
  python2-libselinux \
  make
````

## Get Algo

[Download](https://github.com/trailofbits/algo/archive/master.zip) or clone:

````
git clone git@github.com:trailofbits/algo.git
cd algo
````

If you downloaded Algo, unzip it to your preferred location and `cd` into it.
We'll assume from this point forward that our working directory is the `algo` root directory.

## Prepare algo

Some steps are needed before we can deploy our Algo VPN server.

### Check `pip`

Run `pip -V` and check the Python version it is using:

````
$ pip -V
pip 19.0.3 from /usr/lib/python2.7/site-packages (python 2.7)
````

`python 2.7` is what we're looking for.
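If you prefer to check the interpreter version programmatically rather than by reading `pip -V` output, a minimal sketch (not part of Algo; the expected version is injectable for testing):

```python
import sys

# Minimal sketch: check that an interpreter's major.minor matches what
# these docs expect (Python 2.7). version_info can be injected for testing.
def interpreter_ok(expected=(2, 7), version_info=None):
    vi = version_info or sys.version_info
    return (vi[0], vi[1]) == expected
```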

### Set up virtualenv and install requirements

````
python2 -m virtualenv --system-site-packages env
source env/bin/activate
pip -q install --user -r requirements.txt
````

## Configure

Edit the user list and any other settings you desire in `config.cfg` using your preferred editor.

## Deploy

We can now deploy our server by running:

````
./algo
````

Note the IP and password of the newly created Algo VPN server and store them safely.

If you want to set up the client config on your Fedora Workstation, refer to [the Linux Client docs](client-linux.md).

## Notes on SELinux

If you have SELinux enabled, you'll need to set appropriate file contexts:

````
semanage fcontext -a -t ipsec_key_file_t "$(pwd)(/.*)?"
restorecon -R -v $(pwd)
````

See [this comment](https://github.com/trailofbits/algo/issues/263#issuecomment-328053950).
@ -1,2 +1,2 @@
[local]
localhost ansible_connection=local ansible_python_interpreter=python
localhost ansible_connection=local ansible_python_interpreter=python3
@ -1,139 +0,0 @@
#!/usr/bin/python
# Copyright 2013 Google Inc.
# GNU General Public License v3.0+ (see COPYING or https://www.gnu.org/licenses/gpl-3.0.txt)

from __future__ import absolute_import, division, print_function
__metaclass__ = type


ANSIBLE_METADATA = {'metadata_version': '1.1',
                    'status': ['preview'],
                    'supported_by': 'community'}


DOCUMENTATION = '''
---
module: gce_region_facts
version_added: "5.3"
short_description: Gather facts about GCE regions.
description:
    - Gather facts about GCE regions.
options:
  service_account_email:
    version_added: "1.6"
    description:
      - service account email
    required: false
    default: null
    aliases: []
  pem_file:
    version_added: "1.6"
    description:
      - path to the pem file associated with the service account email
        This option is deprecated. Use 'credentials_file'.
    required: false
    default: null
    aliases: []
  credentials_file:
    version_added: "2.1.0"
    description:
      - path to the JSON file associated with the service account email
    required: false
    default: null
    aliases: []
  project_id:
    version_added: "1.6"
    description:
      - your GCE project ID
    required: false
    default: null
    aliases: []
requirements:
    - "python >= 2.6"
    - "apache-libcloud >= 0.13.3, >= 0.17.0 if using JSON credentials"
author: "Jack Ivanov (@jackivanov)"
'''

EXAMPLES = '''
# Gather facts about all regions
- gce_region_facts:
'''

RETURN = '''
regions:
    returned: on success
    description: >
        Each element consists of a dict with all the information related
        to that region.
    type: list
    sample: "[{
        "name": "asia-east1",
        "status": "UP",
        "zones": [
            {
                "name": "asia-east1-a",
                "status": "UP"
            },
            {
                "name": "asia-east1-b",
                "status": "UP"
            },
            {
                "name": "asia-east1-c",
                "status": "UP"
            }
        ]
    }]"
'''
try:
    from libcloud.compute.types import Provider
    from libcloud.compute.providers import get_driver
    from libcloud.common.google import GoogleBaseError, QuotaExceededError, ResourceExistsError, ResourceNotFoundError
    _ = Provider.GCE
    HAS_LIBCLOUD = True
except ImportError:
    HAS_LIBCLOUD = False

from ansible.module_utils.basic import AnsibleModule
from ansible.module_utils.gce import gce_connect, unexpected_error_msg


def main():
    module = AnsibleModule(
        argument_spec=dict(
            service_account_email=dict(),
            pem_file=dict(type='path'),
            credentials_file=dict(type='path'),
            project_id=dict(),
        )
    )

    if not HAS_LIBCLOUD:
        module.fail_json(msg='libcloud with GCE support (0.17.0+) required for this module')

    gce = gce_connect(module)

    changed = False
    gce_regions = []

    try:
        regions = gce.ex_list_regions()
        for r in regions:
            gce_region = {}
            gce_region['name'] = r.name
            gce_region['status'] = r.status
            gce_region['zones'] = []
            for z in r.zones:
                gce_zone = {}
                gce_zone['name'] = z.name
                gce_zone['status'] = z.status
                gce_region['zones'].append(gce_zone)
            gce_regions.append(gce_region)
        json_output = {'regions': gce_regions}
        module.exit_json(changed=False, results=json_output)
    except ResourceNotFoundError:
        pass


if __name__ == '__main__':
    main()
@ -0,0 +1,93 @@
#!/usr/bin/python
# -*- coding: utf-8 -*-

from __future__ import absolute_import, division, print_function

__metaclass__ = type

################################################################################
# Documentation
################################################################################

ANSIBLE_METADATA = {'metadata_version': '1.1', 'status': ["preview"], 'supported_by': 'community'}

################################################################################
# Imports
################################################################################
from ansible.module_utils.gcp_utils import navigate_hash, GcpSession, GcpModule, GcpRequest
import json

################################################################################
# Main
################################################################################


def main():
    module = GcpModule(argument_spec=dict(filters=dict(type='list', elements='str'), scope=dict(required=True, type='str')))

    if module._name == 'gcp_compute_regions_facts':
        module.deprecate("The 'gcp_compute_regions_facts' module has been renamed to 'gcp_compute_regions_info'", version='2.13')

    if not module.params['scopes']:
        module.params['scopes'] = ['https://www.googleapis.com/auth/compute']

    items = fetch_list(module, collection(module), query_options(module.params['filters']))
    if items.get('items'):
        items = items.get('items')
    else:
        items = []
    return_value = {'resources': items}
    module.exit_json(**return_value)


def collection(module):
    return "https://www.googleapis.com/compute/v1/projects/{project}/{scope}".format(**module.params)


def fetch_list(module, link, query):
    auth = GcpSession(module, 'compute')
    response = auth.get(link, params={'filter': query})
    return return_if_object(module, response)


def query_options(filters):
    if not filters:
        return ''

    if len(filters) == 1:
        return filters[0]
    else:
        queries = []
        for f in filters:
            # For multiple queries, all queries should have ()
            if f[0] != '(' and f[-1] != ')':
                queries.append("(%s)" % ''.join(f))
            else:
                queries.append(f)

        return ' '.join(queries)


def return_if_object(module, response):
    # If not found, return nothing.
    if response.status_code == 404:
        return None

    # If no content, return nothing.
    if response.status_code == 204:
        return None

    try:
        module.raise_for_status(response)
        result = response.json()
    except getattr(json.decoder, 'JSONDecodeError', ValueError) as inst:
        module.fail_json(msg="Invalid JSON response with error: %s" % inst)

    if navigate_hash(result, ['error', 'errors']):
        module.fail_json(msg=navigate_hash(result, ['error', 'errors']))

    return result


if __name__ == "__main__":
    main()
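The filter-combining rule in `query_options` above (an empty list yields an empty query, a single filter passes through unchanged, and multiple filters are each parenthesized and joined with spaces) can be exercised standalone. This is the same logic extracted from the module for illustration:

```python
# Standalone copy of the module's query_options logic, for illustration.
def query_options(filters):
    if not filters:
        return ''
    if len(filters) == 1:
        return filters[0]
    queries = []
    for f in filters:
        # For multiple queries, each filter should be wrapped in ()
        if f[0] != '(' and f[-1] != ')':
            queries.append("(%s)" % f)
        else:
            queries.append(f)
    return ' '.join(queries)
```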
@ -1,2 +1,2 @@
ansible==2.7.12
ansible==2.8.3
netaddr
@ -1,2 +0,0 @@
---
cloudstack_venv: "{{ playbook_dir }}/configs/.venvs/cloudstack"
@ -1,15 +1,8 @@
---
- name: Clean up the environment
  file:
    dest: "{{ cloudstack_venv }}"
    state: absent
  when: clean_environment

- name: Install requirements
  pip:
    name:
      - cs
      - sshpubkeys
    state: latest
    virtualenv: "{{ cloudstack_venv }}"
    virtualenv_python: python2.7
    virtualenv_python: python3
@ -1,2 +0,0 @@
---
digitalocean_venv: "{{ playbook_dir }}/configs/.venvs/digitalocean"
@ -1,105 +1,30 @@
---
- name: Build python virtual environment
  import_tasks: venv.yml

- block:
    - name: Include prompts
      import_tasks: prompts.yml

    - name: Set additional facts
      set_fact:
        algo_do_region: >-
          {% if region is defined %}{{ region }}
          {%- elif _algo_region.user_input %}{{ do_regions[_algo_region.user_input | int -1 ]['slug'] }}
          {%- else %}{{ do_regions[default_region | int - 1]['slug'] }}{% endif %}
        public_key: "{{ lookup('file', '{{ SSH_keys.public }}') }}"

    - block:
        - name: "Delete the existing Algo SSH keys"
          digital_ocean:
            state: absent
            command: ssh
            api_token: "{{ algo_do_token }}"
            name: "{{ SSH_keys.comment }}"
          register: ssh_keys
          until: not ssh_keys.changed
          retries: 10
          delay: 1

      rescue:
        - name: Collect the fail error
          digital_ocean:
            state: absent
            command: ssh
            api_token: "{{ algo_do_token }}"
            name: "{{ SSH_keys.comment }}"
          register: ssh_keys
          ignore_errors: yes

        - debug: var=ssh_keys

        - fail:
            msg: "Please, ensure that your API token is not read-only."

    - name: "Upload the SSH key"
      digital_ocean:
        state: present
        command: ssh
        ssh_pub_key: "{{ public_key }}"
        api_token: "{{ algo_do_token }}"
        name: "{{ SSH_keys.comment }}"
      register: do_ssh_key

    - name: "Creating a droplet..."
      digital_ocean:
        state: present
        command: droplet
        name: "{{ algo_server_name }}"
        region_id: "{{ algo_do_region }}"
        size_id: "{{ cloud_providers.digitalocean.size }}"
        image_id: "{{ cloud_providers.digitalocean.image }}"
        ssh_key_ids: "{{ do_ssh_key.ssh_key.id }}"
        unique_name: yes
        api_token: "{{ algo_do_token }}"
        ipv6: yes
      register: do

    - set_fact:
        cloud_instance_ip: "{{ do.droplet.ip_address }}"
        ansible_ssh_user: root

    - name: Tag the droplet
      digital_ocean_tag:
        name: "Environment:Algo"
        resource_id: "{{ do.droplet.id }}"
        api_token: "{{ algo_do_token }}"
        state: present

    - block:
        - name: "Delete the new Algo SSH key"
          digital_ocean:
            state: absent
            command: ssh
            api_token: "{{ algo_do_token }}"
            name: "{{ SSH_keys.comment }}"
          register: ssh_keys
          until: not ssh_keys.changed
          retries: 10
          delay: 1

      rescue:
        - name: Collect the fail error
          digital_ocean:
            state: absent
            command: ssh
            api_token: "{{ algo_do_token }}"
            name: "{{ SSH_keys.comment }}"
          register: ssh_keys
          ignore_errors: yes

        - debug: var=ssh_keys

        - fail:
            msg: "Please, ensure that your API token is not read-only."
  environment:
    PYTHONPATH: "{{ digitalocean_venv }}/lib/python2.7/site-packages/"

- name: Include prompts
  import_tasks: prompts.yml

- name: "Upload the SSH key"
  digital_ocean_sshkey:
    oauth_token: "{{ algo_do_token }}"
    name: "{{ SSH_keys.comment }}"
    ssh_pub_key: "{{ lookup('file', '{{ SSH_keys.public }}') }}"
  register: do_ssh_key

- name: "Creating a droplet..."
  digital_ocean_droplet:
    state: present
    name: "{{ algo_server_name }}"
    oauth_token: "{{ algo_do_token }}"
    size: "{{ cloud_providers.digitalocean.size }}"
    region: "{{ algo_do_region }}"
    image: "{{ cloud_providers.digitalocean.image }}"
    wait_timeout: 300
    unique_name: true
    ipv6: true
    ssh_keys: "{{ do_ssh_key.data.ssh_key.id }}"
    tags:
      - Environment:Algo
  register: digital_ocean_droplet

- set_fact:
    cloud_instance_ip: "{{ digital_ocean_droplet.data.ip_address }}"
    ansible_ssh_user: root
@ -1,13 +0,0 @@
---
- name: Clean up the environment
  file:
    dest: "{{ digitalocean_venv }}"
    state: absent
  when: clean_environment

- name: Install requirements
  pip:
    name: dopy
    version: 0.3.5
    virtualenv: "{{ digitalocean_venv }}"
    virtualenv_python: python2.7
@ -1,15 +1,8 @@
---
- name: Clean up the environment
  file:
    dest: "{{ ec2_venv }}"
    state: absent
  when: clean_environment

- name: Install requirements
  pip:
    name:
      - boto>=2.5
      - boto3
    state: latest
    virtualenv: "{{ ec2_venv }}"
    virtualenv_python: python2.7
    virtualenv_python: python3
@ -1,2 +0,0 @@
---
gce_venv: "{{ playbook_dir }}/configs/.venvs/gce"
@ -1,14 +1,8 @@
---
- name: Clean up the environment
  file:
    dest: "{{ gce_venv }}"
    state: absent
  when: clean_environment

- name: Install requirements
  pip:
    name:
      - apache-libcloud
      - requests>=2.18.4
      - google-auth>=1.3.0
    state: latest
    virtualenv: "{{ gce_venv }}"
    virtualenv_python: python2.7
    virtualenv_python: python3
@ -0,0 +1,31 @@
---
- name: Build python virtual environment
  import_tasks: venv.yml

- name: Include prompts
  import_tasks: prompts.yml

- name: Create an ssh key
  hcloud_ssh_key:
    name: "algo-{{ 999999 | random(seed=lookup('file', SSH_keys.public)) }}"
    public_key: "{{ lookup('file', SSH_keys.public) }}"
    state: present
    api_token: "{{ algo_hcloud_token }}"
  register: hcloud_ssh_key

- name: Create a server...
  hcloud_server:
    name: "{{ algo_server_name }}"
    location: "{{ algo_hcloud_region }}"
    server_type: "{{ cloud_providers.hetzner.server_type }}"
    image: "{{ cloud_providers.hetzner.image }}"
    state: present
    api_token: "{{ algo_hcloud_token }}"
    ssh_keys: "{{ hcloud_ssh_key.hcloud_ssh_key.name }}"
    labels:
      Environment: algo
  register: hcloud_server

- set_fact:
    cloud_instance_ip: "{{ hcloud_server.hcloud_server.ipv4_address }}"
    ansible_ssh_user: root
@ -0,0 +1,48 @@
---
- pause:
    prompt: |
      Enter your API token (https://trailofbits.github.io/algo/cloud-hetzner.html#api-token):
    echo: false
  register: _hcloud_token
  when:
    - hcloud_token is undefined
    - lookup('env','HCLOUD_TOKEN')|length <= 0

- name: Set the token as a fact
  set_fact:
    algo_hcloud_token: "{{ hcloud_token | default(_hcloud_token.user_input|default(None)) | default(lookup('env','HCLOUD_TOKEN'), true) }}"

- name: Get regions
  hcloud_datacenter_facts:
    api_token: "{{ algo_hcloud_token }}"
  register: _hcloud_regions

- name: Set facts about the regions
  set_fact:
    hcloud_regions: "{{ hcloud_datacenter_facts | sort(attribute='location') }}"

- name: Set default region
  set_fact:
    default_region: >-
      {% for r in hcloud_regions %}
      {%- if r['location'] == "nbg1" %}{{ loop.index }}{% endif %}
      {%- endfor %}

- pause:
    prompt: |
      What region should the server be located in?
      {% for r in hcloud_regions %}
      {{ loop.index }}. {{ r['location'] }} {{ r['description'] }}
      {% endfor %}

      Enter the number of your desired region
      [{{ default_region }}]
  register: _algo_region
  when: region is undefined

- name: Set additional facts
  set_fact:
    algo_hcloud_region: >-
      {% if region is defined %}{{ region }}
      {%- elif _algo_region.user_input %}{{ hcloud_regions[_algo_region.user_input | int -1 ]['location'] }}
      {%- else %}{{ hcloud_regions[default_region | int - 1]['location'] }}{% endif %}
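The Jinja expressions in the prompts above implement a simple precedence: an explicitly set `region` variable wins, then the user's numbered menu choice, then the default index. A Python sketch of the same selection logic (illustrative only, not part of the role; names are hypothetical):

```python
# Sketch of the region-precedence logic from the set_fact tasks above.
# regions is a list of dicts with a "location" key (as the datacenter
# facts return); user_input and default_index are 1-based menu indices.
def pick_region(regions, region=None, user_input=None, default_index=1):
    if region is not None:
        return region
    if user_input:
        return regions[int(user_input) - 1]['location']
    return regions[int(default_index) - 1]['location']
```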
@ -0,0 +1,7 @@
---
- name: Install requirements
  pip:
    name:
      - hcloud
    state: latest
    virtualenv_python: python3
@ -1,2 +0,0 @@
---
lightsail_venv: "{{ playbook_dir }}/configs/.venvs/aws"
@ -1,15 +1,8 @@
---
- name: Clean up the environment
  file:
    dest: "{{ lightsail_venv }}"
    state: absent
  when: clean_environment

- name: Install requirements
  pip:
    name:
      - boto>=2.5
      - boto3
    state: latest
    virtualenv: "{{ lightsail_venv }}"
    virtualenv_python: python2.7
    virtualenv_python: python3
@ -1,2 +0,0 @@
---
openstack_venv: "{{ playbook_dir }}/configs/.venvs/openstack"
@ -1,13 +1,6 @@
---
- name: Clean up the environment
  file:
    dest: "{{ openstack_venv }}"
    state: absent
  when: clean_environment

- name: Install requirements
  pip:
    name: shade
    state: latest
    virtualenv: "{{ openstack_venv }}"
    virtualenv_python: python2.7
    virtualenv_python: python3