Questionnaire on Log aggregation and monitoring for University Project
I'm working on a university project, and I'd really appreciate it if you could take a few minutes to answer this questionnaire. It is mainly aimed at sysadmins. Thanks! https://forms.gle/cb7Vg1s8avGSvjJDA
https://redd.it/1hysbxz
@r_devops
AWS internal CI/CD best practices
I was setting up my own pipeline and ran into this article during a quick Google search.
It suggests alarming at 80% CPU and 80% memory for automatic rollback. What do y'all think? In general, I think the fault-rate percentage depends on your overall traffic volume.
https://redd.it/1hysonc
@r_devops
Personal projects/homelab learning experience
Hey all, bit of a meta post I guess
I've found that having my own pet projects (CI, k3s cluster, Rancher, development with Angular) has significantly increased my "intuition" for solving lots of issues across DevOps topics.
This intuition has transferred really well into day-to-day work, even though the tools we use there are different.
I'd like to know if you have a homelab setup/personal project and how it has helped your skills.
On the other hand, I'd also be interested in folks who refuse to do any projects in their spare time, and the reasoning behind that mindset.
Idk what my actual point is, I guess I just want to open up a conversation about personal projects vs. strictly work-only.
https://redd.it/1hyt7vc
@r_devops
DevOps reading list: what to add?
I've already read the generally recommended list of DevOps books. What else is worth looking at? I'm more interested in organizational change and methodology, not so much tools and tech.
* The DevOps Handbook
* The Phoenix Project
* Team Topologies
* The Unicorn Project
* DORA report
https://redd.it/1hyu78a
@r_devops
Ship it podcast discontinued
A couple of weeks ago I stumbled across the Ship It podcast, only to learn that it's being discontinued. Dang, I liked the content. What could I listen to instead?
https://redd.it/1hysk3q
@r_devops
From idea to deployment in 4 hours
Hey guys,
yesterday I accomplished something I'd had in my head for a few weeks already, and it took only around 4 hours from idea to deployment. Check it out: https://og-img.com
It is completely free, with no registration required.
The idea:
I wanted dynamic OpenGraph images (the og:image metadata you see in blog and social media posts) on my own blog, so posts look better when shared on social media.
The dynamic part is in the URL: if I want to write a blog post about DevOps salaries, I can create my own OpenGraph image simply by changing the URL, like this: https://og-img.com/Devops%20salaries/og.png
The "Devops%20salaries" part is the dynamic part, which produces the final OpenGraph image.
You can generate any image you can imagine, as long as it fits in 100 characters, and embed it into your own blog or social media posts.
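As an aside for readers, the URL scheme described above is just percent-encoding of the title; a minimal shell sketch (only spaces handled, for illustration):

```shell
# Build the dynamic image URL from a post title.
# Only spaces are encoded here; a real client should encode all reserved characters.
title="Devops salaries"
encoded=$(printf '%s' "$title" | sed 's/ /%20/g')
echo "https://og-img.com/${encoded}/og.png"
# prints: https://og-img.com/Devops%20salaries/og.png
```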
Tech stack:
The heavy lifting is done by Node.js in the backend, plus vanilla JavaScript in the frontend.
I decided to dockerize the whole application, mainly because I wanted Caddy as a reverse proxy to handle the Node.js container's traffic.
The Node.js + Caddy stack runs in a Docker Compose setup on a mediocre VM from Hetzner.
I announced the service on r/webdev and got some huge traffic yesterday, but the stack handled all of it perfectly.
If you have any feedback or questions let me know! <3
https://redd.it/1hywubd
@r_devops
Is there a good static analysis tool that's free and that's better than semgrep?
Is there a good static analysis tool that's free and that's better than semgrep?
https://redd.it/1hz3emt
@r_devops
Logging software recommendations
Hey! I'm looking for logging software with a free starter plan and cheap paid plans with good retention times. It's for Java apps (specifically Minecraft servers) where I run multiple instances per server type (for example, the first server type running 5 instances and the second running 10). Any recommendations? I've seen Axiom and I'm currently considering it. Or should I use self-hosted Sentry or another product? Thanks in advance.
https://redd.it/1hz0tv6
@r_devops
Hey! So I'm looking for a logging software that has a free plan for starting and cheap plans with good retention times. It's for Java apps (specifically Minecraft servers) where I run multiple instances per server type (for example, first server type would be running in 5 instances and second server type would be running in 10 instances). Any recommendations that you can give? I've seen Axiom, I'm currently thinking on that. Or should I use Sentry self-hosted or another product? Thanks in advance.
https://redd.it/1hz0tv6
@r_devops
Reddit
From the devops community on Reddit
Explore this post and more from the devops community
How to pass environment variables to a remote server using GitHub Actions
I have a Flask app for which I want to set up CI/CD using GitHub Actions. I'm not sure of the best way to pass environment variables to my Postgres and Flask containers; of course I can't keep a `.env` file in version control.
My approach is to connect to the server over SSH, create a `.env` file there, and reference it from my `docker-compose.yml`.
Is this bad practice?
name: Deploy Flask App
on:
  push:
    branches:
      - main
jobs:
  deploy:
    runs-on: ubuntu-latest
    environment: production
    steps:
      - name: Deploy to production server
        uses: appleboy/ssh-action@master
        with:
          host: ${{ secrets.SERVER_HOST }}
          username: ${{ secrets.SERVER_USERNAME }}
          key: ${{ secrets.SSH_PRIVATE_KEY }}
          script: |
            mkdir -p /app
            git clone https://my_app /app
            cd /app
            # Create .env file from GitHub secrets
            cat > .env.prod << EOL
            REDDIT_USER=${{ secrets.REDDIT_USER }}
            REDDIT_USER_PASSWORD=${{ secrets.REDDIT_USER_PASSWORD }}
            REDDIT_CLIENT_ID=${{ secrets.REDDIT_CLIENT_ID }}
            REDDIT_CLIENT_SECRET=${{ secrets.REDDIT_CLIENT_SECRET }}
            USER_AGENT=${{ secrets.USER_AGENT }}
            POSTGRES_DB=${{ secrets.POSTGRES_DB }}
            POSTGRES_USER=${{ secrets.POSTGRES_USER }}
            POSTGRES_PASSWORD=${{ secrets.POSTGRES_PASSWORD }}
            POSTGRES_HOST=${{ secrets.POSTGRES_HOST }}
            POSTGRES_PORT=${{ secrets.POSTGRES_PORT }}
            OPENAI_API_KEY=${{ secrets.OPENAI_API_KEY }}
            OPENAI_ENCODER=cl100k_base
            OPENAI_EMBEDDING_MODEL=text-embedding-3-small
            OPENAI_EMBEDDING_MODEL_TOKEN_LIMIT=8191
            CHUNK_SIZE=1200
            CHUNK_OVERLAP=120
            EOL
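One hardening tweak if you keep this approach (a sketch with hypothetical placeholder values, not part of the original workflow): make sure the generated secrets file is only readable by the deploy user.

```shell
# Create the env file with owner-only permissions (values are placeholders).
umask 077                      # files created from here on default to 0600
cat > .env.prod << 'EOL'
POSTGRES_DB=example_db
POSTGRES_USER=example_user
EOL
chmod 600 .env.prod            # explicit, in case the umask was already looser
```

Note that the `${{ secrets.* }}` placeholders are substituted by GitHub Actions before the script is sent over SSH, so they never depend on shell expansion on the server.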
https://redd.it/1hz6xv7
@r_devops
Free solution to having Nx Cloud on our bare metal server?
Hi everyone. I wanted to ask if there is a free way to run Nx Remote Cache locally on our bare-metal server; basically a free, local version of Nx Cloud.
My team has a project that uses an Nx monorepo, and we have our own bare-metal servers.
I want to share the Nx cache with other team members to speed up development and testing.
Is there any solution out there, even if it doesn't have all the bells and whistles of the paid version?
https://redd.it/1hz8shb
@r_devops
OpenTofu 1.9 is here with for_each support for providers; here is how to use it.
Last week I decided to try out provider for_each, and there was not really any clear documentation on how to use it. So after looking through the PRs and some other mentions to figure it out, I threw together a blog post with a complete example.
The TL;DR, so you don't even need to go to the blog is:
variable "aws_regions" {
  type = map(string)
  default = {
    "global" : "us-east-1",
    "backup" : "us-west-2"
  }
}

terraform {
  required_providers {
    aws = {
      source  = "hashicorp/aws"
      version = "~>5"
    }
  }
}

provider "aws" {
  for_each = var.aws_regions
  alias    = "by_region"
  region   = each.value
}

data "aws_availability_zones" "this" {
  for_each = var.aws_regions
  provider = aws.by_region[each.key]
}

output "provider_regions" {
  value = { for k, v in data.aws_availability_zones.this : k => tolist(v.group_names)[0] }
}

data "aws_availability_zones" "global" {
  provider = aws.by_region["global"]
}

output "global_region" {
  value = tolist(data.aws_availability_zones.global.group_names)[0]
}
Follow on blog post showing some higher complexity use cases: https://dwood.dev/posts/opentofuproviderforeachcomplexexample/
https://redd.it/1hzch9r
@r_devops
Become a Data Engineer in 2025 (Based on 100 jobs data!)
Happy New Year, everyone! Reposting a combination of 3 of my most upvoted posts from last year, for those looking to set ambitious career goals in 2025, assuming a lot of new people are looking for this info now. After all, there's no better time to plan your next big leap into Data Engineering!
**1. Top skills in demand -**
I analyzed 100 data engineering job descriptions from Fortune 500 companies to find the most frequently mentioned skills. Here are the top skills in demand:
|**Skill Group**|**Frequency**|**Constituents with Frequency**|
|:-|:-|:-|
|Programming Languages|196|SQL (85), Python (76), Scala (21), Java (14)|
|ETL and Data Pipeline|136|ETL (65), Pipeline (46), Integration (25)|
|Cloud Platforms|85|AWS (45), Azure (26), GCP (14)|
|Data Modeling and Warehousing|83|Data Modeling (40), Warehousing (22), Architecture (21)|
|Big Data Tools|67|Spark (40), Big Data Tools (19), Hadoop (8)|
|DevOps, Version Control and CI/CD|52|Git (14), CI/CD (13), Jenkins (7), Version Control (7), Terraform (6)|
|Data Quality and Governance|42|Data Quality (20), Data Governance (13), Data Validation (9)|
|Data Visualization|23|Data Visualization (11), Tableau (6), Power BI (6)|
|Collaboration and Communication|18|Communication (10), Collaboration (8)|
|API and Microservices|11|API (8), Microservices (3)|
|Machine Learning|10|Machine Learning (7), MLOps (2), AI/ML Model Development (1)|
**2. 4 Month Study Plan -**
**Month 1: Foundations**
* DBMS & SQL: Basics of database concepts, querying, and design.
* Python: Focus on Python essentials, including libraries like Pandas and NumPy.
* Linux: Basic commands and navigation.
* DSA: Data structures and algorithms, especially for big tech roles.
**Month 2: Key Concepts & Tools**
* Data Concepts: Topics such as Data Lake, Data Mart, Fabric, and Mesh.
* Data Governance: Management, security, and ethics in data.
* Spark: Introductory concepts with Apache Spark.
* Distributed Systems: Overview of Hadoop, Hive, and MPP systems.
* Cloud Services: Options such as AWS, GCP, or Azure.
**Month 3: Advanced Topics**
* Orchestration: Basics of workflow orchestration with tools like Apache Airflow.
* Compute: Databricks, Snowflake, or equivalents like AWS EMR.
* Containers: Introduction to Docker and Kubernetes.
* CI/CD: Tools such as Jenkins and SonarQube.
* Streaming: Fundamentals of Kafka.
* ETL/ELT: Tools like dbt and Talend, along with architecture basics.
* Terraform: Code-based infrastructure setup.
**Month 4: Projects & Portfolio**
* Build a project portfolio to showcase skills. Examples include:
* Bank Data Warehouse
* Fraud Detection ETL
* Reddit Review Tracker
* Retail Analytics
* Trip Data Transformation
* YouTube Clone
**3. Certifications**
Note - You don't have to do all of these: do 1-2 of AWS or Azure, 1 of Databricks or Snowflake, and 1-2 of the optional certifications based on your interests. Also, I have listed resources only for the ones I know; for the ones I haven't attempted, I have left the field empty - please add yours in the comments.
|**Certification**|**Coverage**|**Cost (USD)**|**Resource**|
|:-|:-|:-|:-|
|AWS Certified Cloud Practitioner|Basics of AWS Cloud concepts, services, and support.|$100|Stephane Maarek's Udemy courses|
|**AWS Certified Solutions Architect – Associate ⭐**|Designing and deploying scalable systems on AWS.|$150|Stephane Maarek's Udemy courses|
|**AWS Certified Data Engineer – Associate ⭐**|Managing data pipelines, analytics, and ETL workflows on AWS.|$150|Stephane Maarek's Udemy courses, AWS Builder Labs|
|Microsoft Azure Data Fundamentals (DP-900)|Core data concepts and implementation using Azure.|$99|Eshant Garg/Scott Duffy Udemy courses, Coursera prep courses|
|**Microsoft Azure Data Engineer Associate (DP-203) ⭐**|Integrating and transforming data for analytics on Azure.|$165|Eshant Garg/Scott Duffy Udemy courses, Coursera prep courses|
|Databricks Lakehouse Fundamentals|Basics of Databricks Lakehouse architecture and workflows.|Free||
|**Databricks Certified Data Engineer Associate ⭐**|Building ETL pipelines and managing data workflows.|$200|Ankit Mistry's Udemy courses|
|Databricks Certified Data Engineer Professional|Advanced data engineering skills on Databricks platform.|$200||
|**SnowPro Core Certification ⭐**|Foundational knowledge of Snowflake architecture and operations.|$175||
|SnowPro Advanced Certification|Advanced expertise in complex Snowflake solutions and optimizations.|$375||
|SnowPro Advanced: Data Engineer|Data modeling, ETL, and tuning on Snowflake.|$375||
|Astronomer Certification for Apache Airflow Fundamentals|Core Apache Airflow concepts, including DAG authoring and scheduling.|$150|Marc Lamberti's Udemy course|
|Confluent Certified Developer for Apache Kafka|Developing applications with Kafka, architecture, and APIs.|$150||
|dbt Analytics Engineering Certification|Building and maintaining data workflows with dbt.|$200||
|HashiCorp Certified: Terraform Associate|Managing cloud resources using Terraform.|$70||
|Data Management Fundamentals Exam|Core principles: data architecture, governance, and quality.|$311||
|Data Governance Specialty|Best practices for governance, compliance, and data quality.|$311||
Tips to save money on these:
* AWS offers a 50% discount on your next exam: after you pass your first certification, you can use a coupon code for the next ones.
* Azure - Coursera prep courses for Azure certifications offer 50% exam discount upon completion.
* For Airflow Fundamentals - Astronomer sometimes runs a promotion to get the certification for free. Follow them/Marc on LinkedIn to know the dates - I got mine in Jan last year.
**➡️Dive deeper! - Checkout my playlist "Data Engineering Career" with details of all of the above -** [https://www.youtube.com/watch?v=5b4CIon\_1pY&list=PLYAUClNVzmDN5D9IW-COX0xy\_8fz8r51k&ab\_channel=AnalyticsVector](https://www.youtube.com/watch?v=5b4CIon_1pY&list=PLYAUClNVzmDN5D9IW-COX0xy_8fz8r51k&ab_channel=AnalyticsVector)
Thanks, hope it added some value! All the best!
https://redd.it/1hzf2uf
@r_devops
Cross-platform alternative to npm scripts
I really enjoy the ability to create aliases for commands using "scripts" in Node's package.json. Is there a lightweight tool that lets me do that cross-platform, workspace-first, and with support for comments?
P.S.: I've looked into just, make, and mise, but they don't seem to do what I want. Mise seems very close, but it is also quite heavy to install.
https://redd.it/1hzlzgw
@r_devops
University and learning - DevOps
Hi, in just over a year, I'll finish my studies in telecommunications and IT. I have a broad foundation in various IT topics, a bit more knowledge in networking, but nothing particularly advanced apart from that. Based on my interests, I’d like to follow the DevOps path and learn more about it before starting an internship.
What learning path should I choose? What steps should I take to make it logical and effective? What should I focus on the most? Should I use platforms like Kode Kloud, for example?
And one more thing: what are the intermediate job positions for someone aiming to become a DevOps engineer in the future?
https://redd.it/1hzobzo
@r_devops
Best practice to filter out bot traffic and spammers?
I launched a site which currently gets some pretty heavy traffic. When I look at my logs, a good chunk of the traffic seems to be probing for WordPress vulnerabilities and other stuff I don't want hitting my server.
What's the best way to mitigate this traffic? I already have fail2ban configured for SSH. Would some other tooling be suitable?
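One way to size the problem before picking tooling is to count the probe requests in the access log; a quick triage sketch (the log lines and path here are made up for illustration):

```shell
# Count requests probing common WordPress paths in an nginx-style access log.
# A sample log is written inline; point grep at your real log file instead.
cat > access.log << 'EOF'
1.2.3.4 - - [10/Jan/2025] "GET /wp-login.php HTTP/1.1" 404
5.6.7.8 - - [10/Jan/2025] "GET /index.html HTTP/1.1" 200
9.9.9.9 - - [10/Jan/2025] "POST /xmlrpc.php HTTP/1.1" 404
EOF
grep -cE 'wp-login\.php|xmlrpc\.php|wp-admin' access.log   # prints 2
```

From there, common mitigations include a fail2ban jail keyed on those patterns, blocking the paths at the reverse proxy, or putting a WAF/CDN in front of the site.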
https://redd.it/1hzuyth
@r_devops
Display the host cursor when recording a video on a headless Windows machine via RDP
I am connecting via RDP to a headless Windows machine hosted on Google Cloud Platform. I need to record some videos on this machine, but every time I try to record something, the mouse cursor does not appear. How can I display the host's cursor on a headless Windows machine? All of the solutions I've seen so far assume access to the host device's hardware, which I don't have.
https://redd.it/1hzxrzs
@r_devops
Communities in Pittsburgh
Does anyone know of any AWS communities in Pittsburgh? I'm new to learning cloud and would like to connect with others to learn, attend events, etc.
https://redd.it/1i003rv
@r_devops
Anyone have experience working at this place?
My husband got offered a DevOps position in Miami by Millennium Management. How is their tech work culture? High pressure like their finance division? Crazy work hours? How are people treated? Thanks!
https://redd.it/1i02918
@r_devops
Next level after DevOps (what role is better paid: SRE, DevSecOps, MLOps, Platform Engineer, Cloud Engineer)
Currently in DevOps, looking to reach the next level and earn more.
Which role is better paid and more future-proof: SRE, DevSecOps, MLOps, Platform Engineer, Cloud Engineer, etc.?
https://redd.it/1i02szp
@r_devops
How are you tracking changes in 3rd party tools that could disrupt your CI/CD pipelines?
I'm curious how you all in the DevOps world keep track of SaaS application updates so you can stay on top of potentially breaking or high-impact changes.
https://redd.it/1i04tby
@r_devops