How do I get a Python virtual environment, mounted into the Airflow Docker container as a volume, working with external_python_task?
# GOAL

- Have a local Python environment that I can swap out and install things into
- without needing to build a new image -> stop the running container -> start a new container

# DONE

- I use the Docker version of Airflow 2.4.1.
- I have successfully mounted the Python virtual environment into the Airflow Docker setup as a volume, as you can see in the docker-compose.yml.
- After restarting Docker with the new yml file it works fine.
- I can jump into the container, manually activate the Python environment, and import and run Python libraries perfectly fine.

# CHALLENGE

- The problem comes when I try to run my test DAG with the new venv2.
- The DAG works with the original external Python environment that is installed via the Dockerfile, but the goal is to not need that, as mentioned above.
- My guess is that this error happens because the Python environment is not activated.
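One note on that last guess before the files: "activating" a venv only prepends its bin/ directory to PATH, and external_python invokes the venv's interpreter binary directly, so activation is not the missing step. A minimal sketch (path assumed from the compose file below) to confirm the mounted venv is intact from inside the container:

import subprocess

# If the venv is healthy, its own interpreter reports sys.prefix as
# /opt/airflow/venv2; a venv created on the host often fails here because
# bin/python3 is a symlink to an interpreter path that only exists on the host.
out = subprocess.run(
    ["/opt/airflow/venv2/bin/python3", "-c", "import sys; print(sys.prefix)"],
    capture_output=True,
    text=True,
)
print(out.stdout or out.stderr)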
​
# Files and ERRORS

docker-compose.yml
version: '3'
x-airflow-common:
  &airflow-common
  image: ${AIRFLOW_IMAGE_NAME:-myown-image-apache/airflow:2.4.1}
  build: .
  environment:
    &airflow-common-env
    AIRFLOW__CORE__EXECUTOR: CeleryExecutor
    AIRFLOW__DATABASE__SQL_ALCHEMY_CONN: NOTPUBLIC
    # ORIGINAL: postgresql+psycopg2://airflow:airflow@postgres/airflow
    # For backward compatibility, with Airflow <2.3
    AIRFLOW__CORE__SQL_ALCHEMY_CONN: NOTPUBLIC
    # ORIGINAL: postgresql+psycopg2://airflow:airflow@postgres/airflow
    AIRFLOW__CELERY__RESULT_BACKEND: NOTPUBLIC
    # ORIGINAL: db+postgresql://airflow:airflow@postgres/airflow
    AIRFLOW__CELERY__BROKER_URL: redis://:@redis:1111/0
    AIRFLOW__CORE__FERNET_KEY: ''
    AIRFLOW__CORE__DAGS_ARE_PAUSED_AT_CREATION: 'true'
    AIRFLOW__CORE__LOAD_EXAMPLES: 'false'
    AIRFLOW__API__AUTH_BACKENDS: 'airflow.api.auth.backend.NOTPUBLIC'
    _PIP_ADDITIONAL_REQUIREMENTS: ${_PIP_ADDITIONAL_REQUIREMENTS:-}
    AIRFLOW__CORE__ENABLE_XCOM_PICKLING: 'NOTPUBLIC'
    AIRFLOW__SMTP__SMTP_HOST: NOTPUBLIC
    AIRFLOW__SMTP__SMTP_PORT: 222
    AIRFLOW__SMTP__SMTP_USER: "NOTPUBLIC"
    AIRFLOW__SMTP__SMTP_PASSWORD: NOTPUBLIC
    AIRFLOW__SMTP__SMTP_MAIL_FROM: [email protected]
    AIRFLOW__WEBSERVER__BASE_URL: NOTPUBLIC
    AIRFLOW__WEBSERVER__WEB_SERVER_SSL_CERT: /opt/airflow/certs/NOTPUBLIC.pem
    AIRFLOW__WEBSERVER__WEB_SERVER_SSL_KEY: /opt/airflow/certs/NOTPUBLIC.pem
    AIRFLOW__CORE__MAX_ACTIVE_RUNS_PER_DAG: 1
    AIRFLOW__CORE__DEFAULT_TASK_EXECUTION_TIMEOUT: 21600
    AWS_SNOWPLOW_ACCESS_KEY: NOTPUBLIC
    AWS_SNOWPLOW_SECRET_KEY: NOTPUBLIC
    AIRFLOW__SCHEDULER__MIN_FILE_PROCESS_INTERVAL: 180
    # AIRFLOW__SCHEDULER__DAG_DIR_LIST_INTERVAL: 600
  volumes:
    - ./dags:/opt/airflow/dags
    - ./logs:/opt/airflow/logs
    - routtofolder/NOTPUBLIC1:/opt/airflow/NOTPUBLIC1
    - routtofolder/NOTPUBLIC2:/opt/airflow/NOTPUBLIC2
    - /routtofolder/NOTPUBLIC3:/opt/airflow/NOTPUBLIC3
    - ./venv2:/opt/airflow/venv2  # <-- THIS IS THE PROBLEMATIC PART
  user: "${AIRFLOW_UID:-50000}:0"
  depends_on:
    &airflow-common-depends-on
    redis:
      condition: service_healthy
    postgres:
      condition: service_healthy
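A sketch of the fix I would try first (the service name and script location are assumptions): since venv2 lives on this bind mount, build it from inside the container so its interpreter matches the image's Python; packages pip-installed into it then survive restarts with no image rebuild.

# make_venv2.py -- run inside the container, e.g.:
#   docker compose exec airflow-worker python /opt/airflow/dags/make_venv2.py
import subprocess
import sys

VENV = "/opt/airflow/venv2"  # same path as the bind-mount target above

# --clear discards a stale host-created venv whose bin/python3 symlink may
# point at an interpreter path that does not exist inside the container.
subprocess.run([sys.executable, "-m", "venv", "--clear", VENV], check=True)

# Install whatever the external task imports (pandas/numpy in the test DAG).
subprocess.run([f"{VENV}/bin/pip", "install", "pandas", "numpy"], check=True)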
​
​
My example DAG that I want to get working:
​
from __future__ import annotations
import logging
import sys
import tempfile
from pprint import pprint
from datetime import timedelta
import pendulum
from airflow import DAG
from airflow.decorators import task
from airflow.operators.python_operator import PythonOperator
from airflow.models import Variable
import requests
from requests.auth import HTTPBasicAuth
my_default_args = {
    'owner': 'Anonymus',
    'email': ['[email protected]'],
    'email_on_failure': True,
    'email_on_retry': False,
}

with DAG(
    dag_id='test_connected_env',
    schedule='10 10 * * *',
    start_date=pendulum.datetime(2021, 1, 1, tz="UTC"),
    catchup=False,
    # execution_timeout=timedelta(seconds=60),
    default_args=my_default_args,
    tags=['sample_tag', 'sample_tag2'],
) as dag:

    # @task.external_python(task_id="test_external_python_venv_task", python=os.fspath(sys.executable))  # ORIGINAL
    # @task.external_python(task_id="test_connected_env_task", python='/opt/airflow/venv1/bin/python3')  # installed via pip in the Dockerfile; this works perfectly fine
    @task.external_python(task_id="test_connected_env_task", python='/opt/airflow/venv2/bin/python3')
    def go():  # this could be any function name
        # import packages here
        print("My Start")
        # if you want to test the error:
        # print(1 + "Airflow")
        import pandas as pd
        print(pd.DataFrame({'a': [1, 2, 3], 'b': [4, 5, 6]}))
        import numpy as np
        print(np.array([1, 2, 3]))
        return print('my end')

    external_python_task = go()
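Aside on the "Airflow is not properly installed" warning in the log below: if venv2 deliberately has no Airflow in it, the 2.4 ExternalPythonOperator accepts an expect_airflow flag that suppresses that probe (treat the parameter as an assumption from my reading of the 2.4 signature; Airflow context keys remain unavailable inside the task either way). A minimal sketch:

from airflow.decorators import task

# Sketch, assuming Airflow 2.4's ExternalPythonOperator supports
# expect_airflow: tell the operator not to look for Airflow inside venv2.
@task.external_python(
    task_id="test_connected_env_task",
    python="/opt/airflow/venv2/bin/python3",
    expect_airflow=False,
)
def go():
    import pandas as pd  # must be installed in venv2 itself
    print(pd.DataFrame({"a": [1, 2, 3]}))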
​
​
​
ERROR that I get:
​
*** Reading local file: /opt/airflow/logs/dag_id=test_connected_env/run_id=manual__2023-03-02T14:15:16.674123+00:00/task_id=test_connected_env_task/attempt=1.log
[2023-03-02, 14:15:18 GMT] {taskinstance.py:1165} INFO - Dependencies all met for <TaskInstance: test_connected_env.test_connected_env_task manual__2023-03-02T14:15:16.674123+00:00 [queued]>
[2023-03-02, 14:15:18 GMT] {taskinstance.py:1165} INFO - Dependencies all met for <TaskInstance: test_connected_env.test_connected_env_task manual__2023-03-02T14:15:16.674123+00:00 [queued]>
[2023-03-02, 14:15:18 GMT] {taskinstance.py:1362} INFO -
--------------------------------------------------------------------------------
[2023-03-02, 14:15:18 GMT] {taskinstance.py:1363} INFO - Starting attempt 1 of 1
[2023-03-02, 14:15:18 GMT] {taskinstance.py:1364} INFO -
--------------------------------------------------------------------------------
[2023-03-02, 14:15:18 GMT] {taskinstance.py:1383} INFO - Executing <Task(_PythonExternalDecoratedOperator): test_connected_env_task> on 2023-03-02 14:15:16.674123+00:00
[2023-03-02, 14:15:18 GMT] {standard_task_runner.py:54} INFO - Started process 15812 to run task
[2023-03-02, 14:15:18 GMT] {standard_task_runner.py:82} INFO - Running: ['airflow', 'tasks', 'run', 'test_connected_env', 'test_connected_env_task', 'manual__2023-03-02T14:15:16.674123+00:00', '--job-id', '142443', '--raw', '--subdir', 'DAGS_FOLDER/test_connected_env_task.py', '--cfg-path', '/tmp/tmp1t0wy5hy']
[2023-03-02, 14:15:18 GMT] {standard_task_runner.py:83} INFO - Job 142443: Subtask test_connected_env_task
[2023-03-02, 14:15:18 GMT] {dagbag.py:525} INFO - Filling up the DagBag from /opt/airflow/dags/test_connected_env_task.py
[2023-03-02, 14:15:18 GMT] {task_command.py:384} INFO - Running <TaskInstance: test_connected_env.test_connected_env_task manual__2023-03-02T14:15:16.674123+00:00 [running]> on host 0ad620763627
[2023-03-02, 14:15:18 GMT] {taskinstance.py:1590} INFO - Exporting the following env vars:
[email protected]
AIRFLOW_CTX_DAG_OWNER=Anonymus
AIRFLOW_CTX_DAG_ID=test_connected_env
AIRFLOW_CTX_TASK_ID=test_connected_env_task
AIRFLOW_CTX_EXECUTION_DATE=2023-03-02T14:15:16.674123+00:00
AIRFLOW_CTX_TRY_NUMBER=1
AIRFLOW_CTX_DAG_RUN_ID=manual__2023-03-02T14:15:16.674123+00:00
[2023-03-02, 14:15:18 GMT] {python.py:725} WARNING - When checking for Airflow installed in venv got Command '['/opt/airflow/venv2/bin/python3', '-c', 'from airflow import version; print(version.version)']' returned non-zero exit status 1.
[2023-03-02, 14:15:18 GMT] {python.py:726} WARNING - This means that Airflow is not properly installed by /opt/airflow/venv2/bin/python3. Airflow context keys will not be available. Please Install Airflow 2.4.1 in your environment to access them.
[2023-03-02, 14:15:18 GMT] {process_utils.py:179} INFO - Executing cmd: /opt/airflow/venv2/bin/python3 /tmp/tmdqmf6q9rg/script.py /tmp/tmdqmf6q9rg/script.in /tmp/tmdqmf6q9rg/script.out /tmp/tmdqmf6q9rg/string_args.txt
[2023-03-02, 14:15:18 GMT] {process_utils.py:183} INFO - Output:
[2023-03-02, 14:15:18 GMT] {process_utils.py:187} INFO - My Start
[2023-03-02, 14:15:18 GMT] {process_utils.py:187} INFO - Traceback (most recent call last):
[2023-03-02, 14:15:18 GMT] {process_utils.py:187} INFO - File "/tmp/tmdqmf6q9rg/script.py", line 38, in <module>
[2023-03-02, 14:15:18 GMT] {process_utils.py:187} INFO - res = go(*arg_dict["args"], **arg_dict["kwargs"])
[2023-03-02, 14:15:18 GMT] {process_utils.py:187} INFO - File "/tmp/tmdqmf6q9rg/script.py", line 30, in go
[2023-03-02, 14:15:18 GMT] {process_utils.py:187} INFO - import pandas as pd
[2023-03-02, 14:15:18 GMT] {process_utils.py:187} INFO - ModuleNotFoundError: No module named 'pandas'
[2023-03-02, 14:15:18 GMT] {taskinstance.py:1851} ERROR - Task failed with exception
Traceback (most recent call last):
File "/home/airflow/.local/lib/python3.8/site-packages/airflow/decorators/base.py", line 188, in execute
return_value = super().execute(context)
File "/home/airflow/.local/lib/python3.8/site-packages/airflow/operators/python.py", line 370, in execute
return super().execute(context=serializable_context)
File "/home/airflow/.local/lib/python3.8/site-packages/airflow/operators/python.py", line 175, in execute
return_value = self.execute_callable()
File "/home/airflow/.local/lib/python3.8/site-packages/airflow/operators/python.py", line 678, in execute_callable
return self._execute_python_callable_in_subprocess(python_path, tmp_path)
File "/home/airflow/.local/lib/python3.8/site-packages/airflow/operators/python.py", line 426, in _execute_python_callable_in_subprocess
execute_in_subprocess(
File "/home/airflow/.local/lib/python3.8/site-packages/airflow/utils/process_utils.py", line 168, in execute_in_subprocess
execute_in_subprocess_with_kwargs(cmd, cwd=cwd)
File "/home/airflow/.local/lib/python3.8/site-packages/airflow/utils/process_utils.py", line 191, in execute_in_subprocess_with_kwargs
raise subprocess.CalledProcessError(exit_code, cmd)
subprocess.CalledProcessError: Command '['/opt/airflow/venv2/bin/python3', '/tmp/tmdqmf6q9rg/script.py', '/tmp/tmdqmf6q9rg/script.in', '/tmp/tmdqmf6q9rg/script.out', '/tmp/tmdqmf6q9rg/string_args.txt']' returned non-zero exit status 1.
[2023-03-02, 14:15:18 GMT] {taskinstance.py:1401} INFO - Marking task as FAILED. dag_id=test_connected_env, task_id=test_connected_env_task, execution_date=20230302T141516, start_date=20230302T141518, end_date=20230302T141518
[2023-03-02, 14:15:18 GMT] {warnings.py:109} WARNING - /home/airflow/.local/lib/python3.8/site-packages/airflow/utils/email.py:120: RemovedInAirflow3Warning: Fetching SMTP credentials from configuration variables will be deprecated in a future release. Please set credentials using a connection instead.
send_mime_email(e_from=mail_from, e_to=recipients, mime_msg=msg, conn_id=conn_id, dryrun=dryrun)
[2023-03-02, 14:15:18 GMT] {email.py:229} INFO - Email alerting: attempt 1
[2023-03-02, 14:15:18 GMT] {email.py:241} INFO - Sent an alert email to ['[email protected]']
[2023-03-02, 14:15:18 GMT] {standard_task_runner.py:102} ERROR - Failed to execute job NONPUBLIC for task test_connected_env_task (Command '['/opt/airflow/venv2/bin/python3', '/tmp/tmdqmf6q9rg/script.py', '/tmp/tmdqmf6q9rg/script.in', '/tmp/tmdqmf6q9rg/script.out', '/tmp/tmdqmf6q9rg/string_args.txt']' returned non-zero exit status 1.; 15812)
[2023-03-02, 14:15:18 GMT] {local_task_job.py:164} INFO - Task exited with return code 1
[2023-03-02, 14:15:18 GMT] {local_task_job.py:273} INFO - 0 downstream tasks scheduled from follow-on schedule check
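Reading the log: the venv's interpreter does start (it prints "My Start"), so the mount itself works; the failure is that pandas is missing from the site-packages this interpreter searches, which is what you get when the venv was built on the host against a different Python version or platform. A quick probe, as a sketch with the same assumed path:

import subprocess

# Print which binary actually runs and which site-packages it searches; a
# host-built venv typically reports a purelib for the wrong Python minor
# version, or holds macOS/Windows wheels that cannot import on Linux.
probe = (
    "import sys, sysconfig; "
    "print(sys.executable); "
    "print(sysconfig.get_paths()['purelib'])"
)
result = subprocess.run(
    ["/opt/airflow/venv2/bin/python3", "-c", probe],
    capture_output=True,
    text=True,
)
print(result.stdout or result.stderr)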
https://redd.it/11g5j1e
Devops age restrictions
Hello guys, do employers look for employees in their 20s-30s to work as DevOps, or can you move up to DevOps in your 40s-50s? What has your experience been like?
https://redd.it/11g5ng9
S3 bucket lifecycle policy
Can we test the S3 bucket lifecycle policy in a dry run immediately?
I have a policy that deletes files older than 30 days. I want to test it immediately to ensure that the policy works. Is it possible?
https://redd.it/11g7ffl
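There is no built-in dry run: lifecycle rules are evaluated asynchronously, roughly once a day. What you can do immediately is list what a 30-day expiration rule would currently match, as in this sketch (bucket name and prefix are placeholders):

from datetime import datetime, timedelta, timezone

import boto3

s3 = boto3.client("s3")
cutoff = datetime.now(timezone.utc) - timedelta(days=30)

# Enumerate objects a Days=30 expiration rule would be eligible to delete now.
paginator = s3.get_paginator("list_objects_v2")
for page in paginator.paginate(Bucket="my-bucket", Prefix=""):
    for obj in page.get("Contents", []):
        if obj["LastModified"] < cutoff:
            print(obj["Key"], obj["LastModified"])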
Help Me Upgrade My DevOps/K8S Communities?
I remember the good old days when I could sit in an IRC channel with like-minded people and have open conversations about topics related to the channel.
There was no marketing, no blog-post links, just a bunch of people with similar interests hanging out and periodically discussing relevant things.
I keep seeking to replicate this experience, but I find that almost anything labeled with DevOps or K8S has become just a marketing channel for weak marketing content, posts for webinars, and basically trying to force content to me rather than engage in discussions.
For example, just take a look at the #devops feed on Twitter - it feels like miles of billboards for devops companies.
LinkedIn is similarly flooded with links to marketing content that contains just enough buzzwords to catch the Google bots, but to leave the reader feeling more lost than ever.
Are there any good Slack/Discord/IRC communities?
Dev.to used to be appealing but now I feel like it's just a lot of useless content as well.
I'd love to join something and contribute my experiences as well as help others where I can!
Would you please share your top DevOps/K8S communities with me?
https://redd.it/11di9fy
Is GitLab Premium worth it at its new price?
Per GitLab's blog, GitLab Premium is getting a price hike from $19/user/month to $29/user/month, with a "transitional" price of $24/user/month.
Their article talks about all the features they've added to Premium since 2018, but I feel like the company has changed significantly since it went public. We've seen no movement on any of the features we care about - mainly related to packaging - and I can't remember the last time that Premium actually got a new feature we appreciated.
Am I alone in feeling like its value is running a bit thin at this price? Do competing products provide a better value than GitLab Premium at its new price?
https://redd.it/11gadwc
Best way to provision a VM in Hyper-V?
The current approach looks like the following.
Pipeline agent builds a golden image with Packer and uploads it to an image repository. Then Terraform connects via WinRM to the Hyper-V host, downloads the prepared .vhdx file from the image repository, and creates the VM with the image attached to it.
Any suggestions to improve this process? I am not sure if something like WDS or PowerShell DSC would be better for provisioning the VMs.
https://redd.it/11gcvey
How to exempt the GitHub Actions bot from the rules of protected branches
I developed a workflow with GitHub Actions that builds some stuff on each push and is supposed to push the changes automatically. This worked fine until the workflow had to be merged on a branch which was protected and did not allow pushing without a pull request first. Is there any way I could allow the bot to carry on with the push, any workaround?
I did find an action that creates a pull request automatically to the branch you push to, from another branch that this action creates (and then deletes). So I could use this with the 2 default branches in my repo, and every time something pushes into those, the action creates a separate branch and generates a PR from there. But this was not the approach I wanted, because I wanted this thing to get built on every push.
Is there really no workaround for exempting the GitHub Actions bot from being counted in the branch protection rules? I know we can exempt users in the organization, but I really don't want to add a user as the author only for this reason.
https://redd.it/11gdhxx
What is the best open-source CI/CD platform?
I've used GitLab in the past, but that is not open-source to the best of my understanding. And I don't like Jenkins, and I had a lot of confusion around installing CircleCI. Any other suggestions?
https://redd.it/11gez33
KodeKloud or ACloudGuru
Hi, I need hands-on training on Docker, Kubernetes/kubectl, Jenkins, Terraform, Azure, and possibly soon AWS, since my company is about to migrate into it. I'm not chasing any certifications and I'm not looking to be a pro in a short amount of time. I just need practical knowledge of the general DevOps toolchain. Would you say KodeKloud has better-designed hands-on labs than ACloudGuru? I brought these two up because they are also offering a discount on their plans right now, and they seem to be the most popular ones. I see the majority of SE people on Reddit prefer KodeKloud, but I just wanted to get a direct recommendation for a learning platform like these. Thanks a lot!
https://redd.it/11gbhtj
DevOps challenges - what do you think? Can you contribute?
Hi people, today I bring an interesting topic (at least for me). Surely, at some point in your career you have wondered about side/pet projects or "how can I increase my technical skills?". So, like a frontend engineer cloning Twitter or some social media, what kind of challenges would you consider interesting for a DevOps engineer? In my case I am thinking of something like:
Put an API in some git-based source code repository hosting (Bitbucket, GitHub, GitLab)
Create a pipeline with some CI/CD tool (Bitbucket Pipelines, GitHub Actions, Jenkins)
Set up a VM in some cloud provider (AWS, Azure, GCP)
Deploy this API on the VM mentioned above
and so on...
Have you seen some GitHub repository with challenges like this?
Can you share some challenge that you consider interesting?
Greetings :)
https://redd.it/11glzlq
Infrastructure as Code tools? Open poll
Choosing IaC tools can be daunting these days for a company interested in going there - LOTS of tools to choose from, aside from cloud or on prem.
When someone says "Infrastructure as Code" to you, what are the top 5 or so tools you think of?
https://redd.it/11gb0mz
Cloud engineer, like a fish out of water [casual chat]
Back in the day, I used to be a mediocre wizard with VBA and formulas in my non-software engineering job. People thought I was the bomb dot com! But then I switched over to QA automation (1yr) and now cloud infra (~1yr), and it's a wake-up call. It's like being a technician or machinist - it's tough! Context switching, lack of domain knowledge, poor debugging tools, and sparse documentation can be draining. I'm not even working on critical tasks but can create critical problems ;)
I'm not one to back down from a challenge. I've learned so much about Kubernetes, GitLab, AWS, GCP, Helm, Terraform, debugging, breaking things, Flux, Linux, and networking (it's always the DNS!!) - you name it! But the learning never stops, and often it's like my brain can't handle any more information. Forget more than I remember. I like creating fitting existing cloud resource terraform modules into our infra, as it's bite-sized and I can adapt it to our needs.
Working with a good team has definitely helped, but lately, I feel like the senior is getting fed up with me. I mean, I'm not doing poorly, but I have questions and I think out loud better. Plus, I have good ideas that just need to be bounced around a bit. It's tough to collaborate over Slack, and we don't video pair that often anymore. Everyone is busy. Things to be done.
In my old engineering jobs, I could just walk over to someone and pick their brain. But in this virtual world, it's not so easy. That said, my boss is happy with my progress, or I'd really be stressing out. Like many, we had a huge round of layoffs and I'm happy I was kept.
Overall, it's been a wild ride, but I'm excited to see where this journey takes me! I can't say this is forever. I talked my way into this role to learn more about tech instead of going to school, but I do prefer data analytics more.
Let's be real - transitioning to a technical role can be tough, but it's also incredibly rewarding (??). I may feel like a fish out of water, but I'm proud of how much I've learned and accomplished so far. And let's not forget about the paycheque - I'm not rolling in dough, but I'm certainly not complaining. So, what do you do with your hard-earned cash? Invest in stocks? Buy a boat? Splurge on fancy dinners? Blow? Or do you save it all for a rainy day? Let's talk about the serious and not-so-serious sides of working in tech, including salaries and all the fun things we do with our money. What motivated you to pursue this? Passion for technology, a desire for a challenge, or something else entirely? Girls?? What was your learning curve like over the years?
https://redd.it/11gazsm
Devops with no devs?
I'm a sysadmin working for a company that doesn't do any internal development; we mostly just consume. Every piece of software in the org outside a few web apps is 3rd party. The app support team is basically just software-usage experts, and when any real issues come up they refer to the vendor for support. Some of these vendors offer modern apps, but mainly the workflow for us is: company needs something to do xyz, find vendor for xyz, purchase software, spin up infrastructure, install.
How can I as a sysadmin incorporate some of the DevOps methodologies and tools when we're basically there just to set up and maintain the OS and below? As we build workloads in Azure I like the thought of maintaining repos for infrastructure, networking, and policy as code, but to what benefit can a sysadmin even do this when, even in Azure, workloads are still software installed on an OS? It's not like we have any repos to monitor to trigger infra deployment pipelines…
Is it possible to containerize any app that is currently installed software? Is that even something the ops team / sysadmins who have previously been only OS and below should take on?
I've worked with git for personal projects, and have deployed workloads in Azure with Terraform, but I want to really dig into the automation possibilities. Am I just so fresh to all this that I'm asking stupid questions and it all should be obvious?
https://redd.it/11gpxje
Question: Content for DevSecOps company flyer
Hello, I'm working as a DevOps engineer with some security background in a small company; as I'm the only female on the team (soon to be another), I got assigned the task of doing a company flyer.
What would be something that would catch your eye on a flyer?
My current ideas are:
Cloud competency (all the different clouds that we are using/supporting)
Security competency (I worked on several projects where we helped companies achieve PCI DSS or ISO 9001/27001)
All the different technologies that we use
Projects that we did
Only requirement that I got is that it needs to be a one-page flyer.
Happy to hear your opinion!!
Ps. sorry, English is not my first language :)
https://redd.it/11dfo5m
Seeking Suggestions for DevOps Internship Topics, Focused on Performance Testing and Delivery
Hello everyone,
I am excited to begin my upcoming DevOps internship and am currently seeking suggestions on internship topics, specifically related to performance testing and delivery. As an intern, I will be working with a team of experienced DevOps professionals, and I want to ensure that I am working on something that is both challenging and meaningful.
I would appreciate any suggestions on topics related to performance testing and delivery, such as tools and methodologies that are commonly used in the industry. Additionally, any tips or advice on how to effectively conduct performance testing and ensure smooth delivery of applications would be greatly appreciated.
Thank you in advance for your help!
https://redd.it/11dd1mf
Jira - Azure DevOps automation
Hi all,
We use Jira with Azure Repositories and Pipelines. I've been reading on this sub that in some organizations, devs just have to do a single PR and everything after that is handled automatically, up to production deploys.
I would like to implement something similar for my team, but I'm not sure what tools are available to accomplish this and how to handle merge conflicts in the most automated way. There is a Jira - AZDO plugin created by Atlassian and MS, but it has not been updated in 4 years.
I could also write a 'bot' that would act as a bridge between the 2 but perhaps there's a better (faster?) way to do it?
Best,
Simo
https://redd.it/11gsuf9
S3 delete files
I have some folders in my S3 bucket that start with 'myfolder-xxxxxxxx', xxxxxxxx being the timestamp. I want to delete any files that are older than one day in these folders. I'm considering using an S3 lifecycle policy, but I'm not sure which prefix to use ('myfolder' or 'myfolder*'). Can someone please advise me on the best approach?
https://redd.it/11gso3z
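For the prefix question: lifecycle filters take a literal key prefix, not a wildcard, so 'myfolder-' (no '*') matches every key beginning with myfolder-<timestamp>. A sketch with a placeholder bucket name; note expiration granularity is whole days, so "older than one day" maps to Days: 1.

import boto3

s3 = boto3.client("s3")

# One rule: expire objects under the literal prefix 'myfolder-' after 1 day.
s3.put_bucket_lifecycle_configuration(
    Bucket="my-bucket",
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "expire-myfolder-after-1-day",
                "Filter": {"Prefix": "myfolder-"},
                "Status": "Enabled",
                "Expiration": {"Days": 1},
            }
        ]
    },
)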