

It’s the other way around, I need to get long links to the CLI
Not when you are working in a CLI
Link shorteners have absolutely been useful when you need to go to a long URL on an entirely different device that you can’t easily copy a link to.
I’ve used them to make it easy to get to scripts.
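A concrete sketch of what I mean, on a machine where you can only type, not paste (the short link below is a made-up placeholder, not one I actually use):

curl -fsSL https://tinyurl.com/my-setup-sh -o setup.sh   # the shortener redirect resolves to the long raw-file URL
less setup.sh   # glance over it before running anything
sh setup.sh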
You would likely need to build a NAS with an HBA (Host Bus Adapter). I’m not aware of any low-end NAS systems that support SAS.
The UK, one of the worst countries.
The only way his ban evasion will work is if they find a buyer in 90 days
😎 🤝 😎
What about Red 40?
I have a drive that’s roughly 13 years old, with around 11 years and 80 days of power-on time, if that gives you an idea of how much my computer is on.
I only restart it when Windows updates start fucking with my networking or my audio drivers entirely shit the bed.
Grayjay is just plugins for everything, so if you want to add a new platform you can make one
So you moved and got an 83,333x improvement just by moving?
What was the time in between those two?
Would be insane going from 28.8 kbps to 2.4 Gbps (2,400,000 / 28.8 ≈ 83,333).
I have gotten more reliable results from Google than from other search engines, even if it involves a middleman service that removes the bullshit
I still use Google search without an issue, just de-bullshitted by the whoogle frontend.
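If you want to try the same thing, the upstream image is benbusby/whoogle-search and it listens on port 5000 by default; a minimal way to spin it up (not my exact deployment) looks like:

docker run -d --name whoogle -p 5000:5000 benbusby/whoogle-search
# then point your browser's search bar at http://localhost:5000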
Incoming wall of text
Here is my install script to set up Ubuntu since it has a bit of extra steps for privileged ports https://gitlab.meme.beer/-/snippets/1
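If you're wondering what the privileged-port steps usually look like, it's typically something along these lines (a sketch, not the exact contents of that snippet):

# let unprivileged processes bind to ports from 80 upward (relevant for rootless setups)
sudo sysctl -w net.ipv4.ip_unprivileged_port_start=80
echo 'net.ipv4.ip_unprivileged_port_start=80' | sudo tee /etc/sysctl.d/99-unprivileged-ports.conf
# alternative: grant a single binary the capability instead of lowering the global floor
# sudo setcap 'cap_net_bind_service=+ep' /path/to/binary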
Docker Compose example; note that my config has a shared network with containers in another compose project called nginx, to keep traffic inside Docker.
name: "gitlab"
services:
gitlab:
image: 'gitlab/gitlab-ce:latest'
#command: update-permissions
restart: always
hostname: 'gitlab.example.com'
environment:
GITLAB_OMNIBUS_CONFIG: |
external_url 'https://gitlab.example.com'
pages_external_url 'https://pages.example.com'
pages_nginx['enable'] = true
pages_nginx['listen_port'] = 6000
pages_nginx['listen_https'] = false
pages_nginx['redirect_http_to_https'] = false
#puma['per_worker_max_memory_mb'] = 2048 # 2GB
gitlab_rails['gitlab_email_from'] = '[email protected]'
gitlab_rails['gitlab_email_display_name'] = 'GitLab'
gitlab_rails['smtp_enable'] = true
gitlab_rails['smtp_address'] = "smtp.sendgrid.net"
gitlab_rails['smtp_port'] = 587
gitlab_rails['smtp_user_name'] = 'apikey'
gitlab_rails['smtp_password'] = '$SENDGRID_API_KEY_HERE'
gitlab_rails['smtp_domain'] = "smtp.sendgrid.net"
gitlab_rails['smtp_authentication'] = "login"
gitlab_rails['smtp_enable_starttls_auto'] = true
gitlab_rails['smtp_tls'] = false
gitlab_rails['gitlab_default_theme'] = 2
gitlab_rails['gitlab_shell_ssh_port'] = 2224
gitlab_rails['gitlab_default_projects_features_container_registry'] = true
gitlab_rails['registry_enabled'] = true
gitlab_rails['registry_api_url'] = 'https://registry.example.com'
gitlab_rails['registry_issuer'] = 'gitlab-issuer'
registry['log_level'] = 'info'
registry_external_url 'https://registry.example.com'
registry_nginx['enable'] = true
registry_nginx['listen_port'] = 5050
registry_nginx['listen_https'] = false
registry_nginx['redirect_http_to_https'] = false
gitlab_shell['log_level'] = 'INFO'
letsencrypt['enable'] = false
nginx['error_log_level'] = 'info'
nginx['listen_https'] = false
#nginx['proxy_protocol'] = true
#nginx['trusted_proxies'] = ["10.0.0.0/8", "172.16.0.0/12", "192.168.0.0/16"]
# Workhorse
gitlab_workhorse['enable'] = true
gitlab_workhorse['ha'] = false
gitlab_workhorse['listen_network'] = "tcp"
gitlab_workhorse['listen_addr'] = "127.0.0.1:8181"
gitlab_workhorse['log_directory'] = "/var/log/gitlab/gitlab-workhorse"
# Errors
# for sentry error logging the GitLab service
#gitlab_rails['sentry_enabled'] = true
#gitlab_rails['sentry_dsn'] = ''
#gitlab_rails['sentry_clientside_dsn'] = ''
#gitlab_rails['sentry_environment'] = 'production'
# Add any other gitlab.rb configuration here, each on its own line
networks:
- nginx
ports:
# gitlab loves https on 443
#- '80:80'
#- '443:443'
- '2224:22'
volumes:
- ./config:/etc/gitlab
- ./logs:/var/log/gitlab
- ./data:/var/opt/gitlab
shm_size: '256m'
#deploy:
# resources:
# limits:
# cpus: '6'
# memory: 12G
# reservations:
# cpus: '4'
# memory: 6G
# disable healthcheck for restoring backup
#healthcheck:
# disable: true
networks:
nginx:
external: true
name: nginx
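Bringing it up is the usual compose routine; the external nginx network has to exist first (in my case the other nginx compose project creates it, so the manual create below is only needed if you don't have that):

docker network create nginx       # only if nothing else has created it yet
docker compose up -d
docker compose logs -f gitlab     # first boot takes a few minutes while it reconfigures itself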
The VM has 6 threads and 16 GB of RAM.
The OS is currently Ubuntu 22.04.5 LTS (the cloud image, which is lightweight), just running a very simple Docker Engine install done by the script (plus a few other options, since I script the install).
The load averages as of this moment are 0.12, 0.15, 0.10, so not even a full thread is being used.
I let the container run unmetered on the CPU and memory.
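If you want to check the same thing on your side, the standard tools are enough:

uptime                    # prints the 1/5/15 minute load averages
docker stats --no-stream  # per-container CPU and memory usage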
I can provide both the compose and my install script (which is on the GitLab instance) if you are curious.
I run GitLab with Docker Compose and Watchtower; all the updates are automated and have never caused any issues for me.
That being said, my setup uses about 7-8 GB of RAM.
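The Watchtower side is nothing special; something close to the upstream quick-start is enough (the interval below is an example value, not necessarily what I run):

docker run -d --name watchtower \
  -v /var/run/docker.sock:/var/run/docker.sock \
  containrrr/watchtower --interval 86400   # check for new images once a day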
Privacy also doesn’t exist when the entire website is being indexed.
I’m also a firm believer that you don’t need to freely give up your data.
https://12ft.io/