Reddit DevOps
268 subscribers
1 photo
31K links
Reddit DevOps. #devops
Thanks @reddit2telegram and @r_channels
Provision serverless service with Terraform or not? (Planning to use GCP Cloud Run)

Hi, I would like to deploy several services on GCP Cloud Run and I'm a bit unsure about the recommended way to provision them.

Should I create them through Terraform, or just use the "gcloud run deploy" command?
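
For comparison, a minimal Terraform sketch of a Cloud Run service (the service name, region, and image path below are placeholders; `gcloud run deploy` achieves the same in one command but leaves no declarative record in your repo):

```hcl
# Hypothetical example: provisions one Cloud Run v2 service declaratively.
resource "google_cloud_run_v2_service" "api" {
  name     = "my-api"       # placeholder service name
  location = "europe-west1" # placeholder region

  template {
    containers {
      image = "europe-west1-docker.pkg.dev/my-project/repo/api:latest"
    }
  }
}
```

A common compromise is to manage the service, IAM, and networking in Terraform while letting CI push new image revisions with `gcloud run deploy`, so state drift is limited to the image tag.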

https://redd.it/1ga8ohi
@r_devops
Asking for advice

I'm a computer science student. The job market in my country hires DevOps interns all the time for end-of-year internships, and I'm trying to get this opportunity since I'm really interested in a DevOps career. Can any of the tech leads or recruiters here who are actively hiring DevOps engineers give me some advice on what makes someone a good candidate?

I studied really hard for the last two years and I have good knowledge of DevOps practices and concepts. I've had a lot of hands-on experience with different concepts (GitOps, IaC, cloud) and technologies like Jenkins, GitLab, ArgoCD, Ansible, and Terraform, built some CLI tools in Go and Python, done projects on AWS and GCP, and had some software engineering internships where I got a picture of how software is built and delivered.

I am really interested in the key skills that make the difference, and also in the projects you'd like to see on a resume.

I am ready to hear your feedback, and if possible I can share my resume with you so you can roast it.

Thank you 🙏🏻

https://redd.it/1gabhh3
@r_devops
Need help with Google OAuth 2 for Argo Workflows Dex authentication using Argo CD's Dex

I went through the documentation Argo provides for adding Dex authentication using the Dex server that Argo CD ships with. It was a bit confusing, with many fields not matching the position or even the name of those in the current values.yaml of the Helm chart. I got Google's OAuth2 working on Argo CD with Dex using the default config provided in the chart's values file. The problem is that adding the same Dex auth method to Argo Workflows isn't as simple, because Argo Workflows requires a service account. I followed the documentation to map a service account to a group, which requires reinstalling Argo Workflows, so I did that. But now, instead of being asked to choose an account, I get:

Access blocked: authorisation error

Some requested scopes were invalid. {valid=[openid], invalid=[groups]} Error 400: invalid_scope

Does anyone here know how to implement Argo CD Dex authentication on the Argo server used by Argo Workflows?
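
One likely cause, as an assumption based on how Dex's Google connector is documented: Google's OAuth endpoint has no literal `groups` scope, so requesting it fails with `invalid_scope`. Dex's Google connector instead fetches Google Workspace groups through the Admin SDK Directory API, which needs a service account key and an admin email to impersonate. A sketch of the connector block (the hostnames, paths, and email are placeholders; adjust the keys to your chart version):

```yaml
# Hypothetical dex.config fragment for the Argo CD Helm values.
connectors:
  - type: google
    id: google
    name: Google
    config:
      clientID: $GOOGLE_CLIENT_ID
      clientSecret: $GOOGLE_CLIENT_SECRET
      redirectURI: https://argocd.example.com/api/dex/callback
      # Group lookup happens via the Directory API, not an OAuth scope:
      serviceAccountFilePath: /tmp/oauth/sa.json
      adminEmail: admin@example.com
```

With groups resolved by the connector, the Argo Workflows client would request only scopes Google accepts (openid, profile, email) and let Dex inject the groups claim into the token.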

https://redd.it/1gaanw3
@r_devops
Request for Features: OneUptime, an open-source observability platform

We're building an open source observability platform - OneUptime (https://oneuptime.com). Think of it as your open-source alternative to Datadog, NewRelic, PagerDuty, and Incident.io—100% FOSS and Apache Licensed.

Already using OneUptime? Huge thanks! We’d love to hear your feedback.

Not on board yet? We’re curious why and eager to know how we can better serve your needs. What features would you like to see implemented? We listen to this community very closely and will ship updates for you all.

Looking forward to hearing your thoughts and feedback!

https://redd.it/1gag4vx
@r_devops
Avoiding unexpected overages

For those managing multiple APIs, how do you keep track of usage and avoid unexpected overages?
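
One lightweight approach is client-side budget tracking, so you alert before the provider's billing does. A minimal sketch (the API names and limits are made up; a real setup would persist counters and reset them each billing cycle):

```python
from dataclasses import dataclass, field

@dataclass
class QuotaTracker:
    """Tracks per-API call counts against a per-cycle budget."""
    limits: dict                 # api name -> allowed calls per billing cycle
    warn_ratio: float = 0.8      # alert threshold before the hard limit
    used: dict = field(default_factory=dict)

    def record(self, api: str, calls: int = 1) -> str:
        """Record usage and return 'ok', 'warn', or 'over'."""
        self.used[api] = self.used.get(api, 0) + calls
        ratio = self.used[api] / self.limits[api]
        if ratio >= 1.0:
            return "over"
        if ratio >= self.warn_ratio:
            return "warn"
        return "ok"
```

Wiring the "warn" result to a Slack webhook or PagerDuty alert is usually enough to catch runaway usage days before the invoice does.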

https://redd.it/1gagvwv
@r_devops
I wrote a piece on the evolution we're witnessing in the field of automation. I'd be humbled to get your feedback on it and discuss the topic with the DevOps community.

hey!

some time ago a thought struck me: what if I started writing about my experiences from my day-to-day work as a data engineer? I have a knack for automating stuff, so I genuinely wanted to focus on this topic.

I enjoy discussing the topics of automation, technology, and artificial intelligence with fellow thinkers. I hope that showcasing my thought process and point of view in a longer text will allow people who find this interesting to reach out to me and/or provide some feedback, ideally to discuss the subjects I stir up.

I've been recently thinking a lot about the progress we're witnessing in the field of generative AI, especially in a broader context of evolving automation—it's not just gears and gadgets anymore. I'm persuaded we're stepping into the third era of automation: intelligence, after automating physical labor and calculation. It's an exciting, inevitable, and challenging journey.

the link below will take you to the piece I've prepared to organize how I think about the automation evolution and how to find my way in the changing world (no LLM participated in the writing process :) )

🔗 https://toolongautomated.substack.com/p/automation-unbound

I dive into the following topics:

👉 the three eras of automation: physical labor, calculation, and intelligence.

👉 automation in our daily lives: whether we like it or not, automation is everywhere.

👉 lessons from history: what the past teaches us about adapting to a world increasingly shaped by machines.

I'd be humbled to hear your feedback on the piece, and hope to have some discussion about the subjects:

1. are you afraid and/or skeptical about progressing automation and AI?
2. do you enjoy discussing this subject or are you rather reluctant to do that?
3. if an artifact (a.k.a. indirect intelligence) is created by what I call direct intelligence (human) and that artifact appears to be a synthetic being, then should we call this artifact direct intelligence?

https://redd.it/1gahf5n
@r_devops
Detect and fix bugs early with AI

Just read an article about Early - an AI tool designed to catch bugs before they become a problem. I'm curious about how this could impact our daily coding practices and overall project timelines.

Do you think integrating AI like this can enhance our productivity and code quality? Have any of you had experiences with similar tools that you found beneficial or challenging?



https://redd.it/1gajuke
@r_devops
Database DevOps schema changes

How do you do database schema changes in your team? Do your DevOps engineers own it, or the devs?

Are your schema changes tracked using Flyway or another tool, applied first in the dev DB and then promoted to prod?

In ours, the prod DB is separate: SQL file changes are applied manually, and there are no direct schema changes in prod because of the DB team's review and approval process.
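
For reference, a minimal sketch of how Flyway-style tracking usually looks (the file names and table are hypothetical): versioned SQL files live in the repo, and the same files that ran against dev are replayed against prod by the pipeline.

```sql
-- db/migration/V1__create_orders.sql  (hypothetical first migration)
CREATE TABLE orders (
    id          BIGINT PRIMARY KEY,
    customer_id BIGINT NOT NULL,
    created_at  TIMESTAMP NOT NULL DEFAULT CURRENT_TIMESTAMP
);

-- db/migration/V2__add_status.sql  (each change is a new versioned file)
ALTER TABLE orders ADD COLUMN status VARCHAR(20) NOT NULL DEFAULT 'new';
```

`flyway migrate` applies any not-yet-run versions in order and records them in a schema history table, so dev and prod converge on the same sequence; the DB team's approval can then become a pipeline gate on the migration files rather than a manual copy-paste into prod.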



https://redd.it/1gakqi1
@r_devops
Doing certifications makes me feel like an idiot, does everyone experience this?

So I have been working in the industry for maybe 8 years total, 5 years of that in my current full-stack developer role (dev / testing / deployment all in one role). However, I have been told I need to complete an industry-certified exam if I want to go for promotion.

At work we have a 4-day training event on the ISTQB syllabus 4, so I thought that would be a good one to go to, as I do lots of the testing for the team and I'd say I'm fairly good at it. It's only 2 days in, and having done about 5 mock exams I'm averaging 40-50%, which is terrible when you need 70% for a pass.

I'm just having real issues in two places:

1. Questions I have no idea about, because the topic has never come up, and will never come up, in my job.
2. Terminology that means different things in the exam and at my company.

For example, we were talking about testing and executing "lines"; this refers to the lines in a logic flow diagram, not executing lines of code. What our team calls unit tests are referred to as component tests in the exam, what we call smoke tests are referred to as system integration tests, and our acceptance tests would actually be called regression tests based on the syllabus.

It's just really annoying, and has sort of angered me, that I have been able to write full penetration testing plans, set up test environments with test data, been involved in full end-to-end tests across multiple services, and even built our team's first-ever AWS S3 connectivity tests for connecting to cloud services, yet cannot pass a foundation-level certification exam on testing.

https://redd.it/1gaiit4
@r_devops
How much should I get paid

A friend is asking me to do some Terraform IaC for his company. However, I'm not sure how much to charge. Could you give me advice about the price of the following work, or what I have to consider to quote a reasonable price:
- create a terraform module for a product they made on azure cloud
- implement an Azure DevOps pipeline to deploy infrastructure changes on Azure (CI/CD)

Thanks for your help

https://redd.it/1gaqpli
@r_devops
Pivoting into cloud engineering may be tough...

Hey DevOps folks,
After running my first workshop, *A Day in the Life of a Cloud Engineer*, it hit me just how frustrating this career path has become for many of you. The **outsourcing of entry-level cloud roles** has made it feel like no matter how many certifications you earn or skills you build, companies will still look past you. It’s disheartening, and worse, it leaves a lot of smart and capable professionals wondering if they’ll ever get a real chance to enter this space.

That’s why I’ve put together a **free workshop series** to help you overcome these challenges. We’ll focus on:

* **Key skills that employers actually care about** so you can focus your energy
* **Building your first cloud project** to prove you can solve real problems
* **Navigating interview techniques** to stand out, even in this competitive market

If this resonates with you, check the link in my profile to join. And if you’re navigating these struggles too, connect with me on LinkedIn—I’d love to chat and help however I can!

https://redd.it/1garq3g
@r_devops
Record your terminal history to create executable runbooks

I am building Savvy, a new kind of terminal recording tool that lets you edit, run, and share recordings in a way that Asciinema does not support.

It also has local redaction to avoid sharing sensitive data, such as API tokens, PII, customer names, etc. Example runbook: https://app.getsavvy.so/runbook/rb_b5dd5fb97a12b144/How-To-Retrieve-and-Decode-a-Kubernetes-Secret

What are some tools y'all are using to create/store runbooks?

https://redd.it/1gasf1s
@r_devops
How much time do you spend fixing issues?

I'm considering going into DevOps; I have a background as a backend developer. My question is: how much (maybe as a %) of your time do you spend fixing issues, and how much do you spend actually deploying new infrastructure, configuring, and doing other typical DevOps tasks? Thanks

https://redd.it/1gavxyk
@r_devops
GitOps vs dynamic updates to K8s objects

I am a bit new to GitOps and am wondering what everyone thinks about programmatic creation and updates of Kubernetes objects when the application is otherwise managed by, for instance, FluxCD. Is it really an antipattern?

In detail:

We have a central team managed Kubernetes cluster, where we can deploy our applications through GitOps. Now, we are building a platform (i.e., common stuff for many similar applications) that would programmatically interact with the kube-apiserver to update ConfigMaps, fire up Jobs, for starters. This is to decouple the business applications from the target environment.

Do you think we should not do it? I know that we technically can do it, it has worked in a PoC environment, but the central team says we should not do it, because it is against the GitOps principles. What do you all think?

(We could use HPA, KEDA, sidecars so that we can avoid live kube-apiserver interactions, but should we? Especially if we can implement the functionality with basic k8s objects.)

https://redd.it/1gawqbt
@r_devops
The biggest compliment I've ever received.

Earlier this year, I was working on a proof of concept involving the installation of an LDAP server and authentication via SSH. For that, I needed to enable SSH password authentication [I can already hear you typing. I KNOW!!] to make it work. I ran into a lot of issues with the latest Ubuntu and felt like I was banging my head against the wall until I finally found the solution. I decided to share my findings on superuser.com to help anyone else who might encounter the same problem.

Fast forward to today. [I check my email once every 3-4 days; currently, I have over 2,000 unread emails.] One in particular caught my attention. I received it 2 days ago, and it reads:

Hi!
I'm not a `superuser.com` wbsite user and I can't write a DM to you, but I found your mail and I've just want to say thank you for your answer! I spend 2 hours on troubleshooting why I can't log into server ssh vias password... Again thanks and have a nice day (or night) whenever you'll read that xD

I'm deeply touched. I've never received an upvote via email before. Thank you, "Denis K"—you've made my day!

[Screenshot: Email exchange]

[Screenshot: Unread mail counter]

https://redd.it/1gaysnm
@r_devops
Cloud Exit Assessment: How to Evaluate the Risks of Leaving the Cloud

Dear all,

**I intend this post more as a discussion starter, but I welcome any comments, criticisms, or opposing views.**

I would like to draw your attention for a moment to the topic of 'cloud exit.' While this may seem unusual in a DevOps community, I believe most organizations lack an understanding of the vendor lock-in they encounter with a cloud-first strategy, and there are limited tools available on the market to assess these risks.

Although there are limited articles and research on this topic, you might be familiar with it from the mini-series of articles by DHH about leaving the cloud: 
[https://world.hey.com/dhh/why-we-re-leaving-the-cloud-654b47e0](https://world.hey.com/dhh/why-we-re-leaving-the-cloud-654b47e0) 
[https://world.hey.com/dhh/x-celebrates-60-savings-from-cloud-exit-7cc26895](https://world.hey.com/dhh/x-celebrates-60-savings-from-cloud-exit-7cc26895)

(a little self-promotion, but (ISC)² also found my topic suggestion to be worthy: [https://www.isc2.org/Insights/2024/04/Cloud-Exit-Strategies-Avoiding-Vendor-Lock-in](https://www.isc2.org/Insights/2024/04/Cloud-Exit-Strategies-Avoiding-Vendor-Lock-in))

It's not widely known, but in the European Union, the European Banking Authority (EBA) is responsible for establishing a uniform set of rules to regulate and supervise banking across all member states. In 2019, the EBA published the "Guidelines on Outsourcing Arrangements" technical document, which sets the baseline for financial institutions wanting to move to the cloud. This baseline includes the requirement that organizations must be prepared for a cloud exit in case of specific incidents or triggers.

Due to unfavorable market conditions as a cloud security freelancer, I've had more time over the last couple of months, which is why I started building a unified cloud exit assessment solution that helps organizations understand the risks associated with their cloud landscape, along with the challenges and constraints of a potential cloud exit. The solution is still in its early stages (I’ve built it without VC funding or other investors), but I would be happy to share it with you for your review and feedback.

The 'assessment engine' is based on the following building blocks:

1. **Define Scope & Exit Strategy type:** For Microsoft Azure, the scope can be a resource group, while for AWS, it can be an AWS account and region.
2. **Build Resource Inventory:** List the used resources/services.
3. **Build Cost Inventory:** Identify the associated costs of the used resources/services.
4. **Perform Risk Assessment:** Apply a pre-defined rule set to examine the resources and complexity within the defined scope.
5. **Conduct Alternative Technology Analysis:** Evaluate the available alternative technologies on the market.
6. **Develop Report (Exit Strategy/Exit Plan):** Create a report based on regulatory requirements.

I've created a lightweight version of the assessment engine, and you can try it yourself:
[https://exitcloud.io/](https://exitcloud.io/) 
(No registration or credit card required)

Example report - EU: 
[https://report.eu.exitcloud.io/737d5f09-3e54-4777-bdc1-059f5f5b2e1c/index.html](https://report.eu.exitcloud.io/737d5f09-3e54-4777-bdc1-059f5f5b2e1c/index.html)
(for users who do not want to test it on their own infrastructure but are interested in the output report *)

*\* The example report used the 'Migration to Alternate Cloud' exit strategy, which is why you will find only cloud-related alternative technologies.*

To avoid any misunderstandings, here are a few notes:

* The lightweight version was built on Microsoft Azure because it was the fastest and simplest way to set it up. (Yes, a bit ironic…)
* I have no preference for any particular cloud service provider; each has its own advantages and disadvantages.
* I am neither a frontend nor a hardcore backend developer, so please excuse me if the aforementioned lightweight version contains some 'hacks.'
* I’m not trying to convince anyone that the cloud is good or bad.
* Since a cloud exit depends on an enormous number of factors and there can be many dependencies for an application (especially in an enterprise environment), my goal is not to promise a solution that solves everything with just a Next/Next/Finish approach.

Many Thanks,
Bence.

https://redd.it/1gayf4t
@r_devops
Using ServiceConnection env variables

Hi there,

I've been trying to wrap my head around this. I'm fairly new to DevOps; so far I've been placing variables (such as tenant ID, client ID, etc.) in the scripts themselves. Then I figured out a way to create one variables.yaml file per tenant, which already made things a bit nicer.

Now I've run into something I can't seem to get to work.

If I understand correctly, I should be able to extract info such as the tenant ID and client ID, but also the access token, from the Service Connection I've configured for the project in DevOps, using these $env: variables:

$env:AZURE_TENANT_ID
$env:AZURE_CLIENT_ID
$env:AZURE_ACCESS_TOKEN

I've modified my main.yaml to set addSpnToEnvironment to true, and I've added them as arguments to the script line.

Yet when running the pipeline, the script still shows these variables as empty.

The App Registration has API permissions for Directory.Read.All and Application.Read.All

So I believe that should be sufficient.

Can anyone please help me along? I'm starting to chase my own tail right now, ending up in circles with things I've already tried :)
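
For what it's worth, the AzureCLI@2 task documents addSpnToEnvironment as injecting $env:servicePrincipalId, $env:servicePrincipalKey (or $env:idToken with workload identity federation), and $env:tenantId rather than AZURE_*-style names. A sketch of an inline script relying on those instead (untested against this exact setup; the service connection name is taken from the post):

```yaml
# Hypothetical variant of the AzureCLI@2 step. addSpnToEnvironment injects
# servicePrincipalId / servicePrincipalKey / tenantId into the script env;
# the Graph token is fetched explicitly via the already-logged-in Azure CLI.
- task: AzureCLI@2
  inputs:
    azureSubscription: 'Repo-EntraID'
    scriptType: 'ps'
    addSpnToEnvironment: true
    scriptLocation: 'inlineScript'
    inlineScript: |
      $token = az account get-access-token --resource https://graph.microsoft.com --query accessToken -o tsv
      .\SendMailMessage\SendMailMessage.ps1 -AccessToken $token -TenantId $env:tenantId -ClientId $env:servicePrincipalId
```

If the AZURE_* variables in the original script come from somewhere else (e.g. a variable group), checking their exact spelling against that source would be the first thing to verify.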

Purpose of the script: a test script to figure out how to send emails from DevOps pipelines using the Graph API. In the end we want to use this for all sorts of automated tasks (clean up inactive devices, verify specific SAML settings for enterprise apps, whatever else you can think of that you can script and that would reduce the daily workload of repetitive tasks).

Right now the PS1 is a bit of a mess, because of a full day of testing, modifying etc.

MAIN.YAML:

trigger: none
schedules:
  - cron: "0 0 1 * *"  # Run at midnight on the first day of every month
    displayName: Run once a month
    branches:
      include:
        - main
    always: true

pool:
  vmImage: 'windows-latest'

steps:
  - task: AzureCLI@2
    inputs:
      azureSubscription: 'Repo-EntraID'
      scriptType: 'ps'
      addSpnToEnvironment: true
      scriptLocation: 'inlineScript'
      inlineScript: |
        # Call the SendMailMessage script with the environment variables
        .\SendMailMessage\SendMailMessage.ps1 -AccessToken $env:AZURE_ACCESS_TOKEN -TenantId $env:AZURE_TENANT_ID -ClientId $env:AZURE_CLIENT_ID
    displayName: 'Send Email using Microsoft Graph and Service Connection'




SendMailMessage.ps1

param (
    [string]$TenantId,
    [string]$ClientId,
    [string]$AccessToken
)

# Convert the access token to a secure string
Write-Host "Converting access token to secure string..."
$secureAccessToken = ConvertTo-SecureString $AccessToken -AsPlainText -Force

# Parameters for the email
$EmailSender = 'servicepunt@<domainname>'
$Recipient = '<my own mailaddress>'
$Subject = 'DevOps mail'
$Body = 'This is a mail from DevOps MDK'

# Show parameters
Write-Host "Starting script execution..."
Write-Host "From: $EmailSender"
Write-Host "To: $Recipient"
Write-Host "Subject: $Subject"
Write-Host "Body: $Body"
Write-Host "TenantID: $TenantId"
Write-Host "ClientID: $ClientId"
Write-Host "TenantID env: $env:AZURE_TENANT_ID"
Write-Host "ClientID env: $env:AZURE_CLIENT_ID"

# Check if AccessToken is empty
Write-Host "Checking if AccessToken is empty..."
if ([string]::IsNullOrWhiteSpace($AccessToken)) {
    Write-Error "AccessToken is empty. Please check your service connection and ensure it has the necessary permissions."
    exit 1  # Exit the script with a non-zero status code
}

Write-Host "Connecting to Microsoft Graph..."
Connect-MgGraph -AccessToken $secureAccessToken -NoWelcome

# Prepare headers for further API calls
Write-Host "Preparing headers for API calls..."
$header = @{
    'Authorization' = "Bearer $AccessToken"
}

# Verify connection to Microsoft Graph
Write-Host "Verifying connection to Microsoft Graph..."
try {
    $graphProfileUrl = "https://graph.microsoft.com/v1.0/me"
    $profileResponse = Invoke-RestMethod -Uri $graphProfileUrl -Method Get -Headers $header

    Write-Host "Successfully connected to Microsoft Graph. User profile information retrieved:"
    Write-Host "User Display Name: $($profileResponse.displayName)"
} catch {
    Write-Error "Failed to connect to Microsoft Graph with the provided AccessToken: $_"
    exit 1  # Exit the script with a non-zero status code
}

# Microsoft Graph API URL for sending mail
$mailSendUrl = "https://graph.microsoft.com/v1.0/users/$EmailSender/sendMail"

# Compose Email
Write-Host "Composing email..."
$emailBody = @{
    message = @{
        subject = $Subject
        body = @{
            contentType = "Text"
            content     = $Body
        }
        toRecipients = @(
            @{
                emailAddress = @{
                    address = $Recipient
                }
            }
        )
        from = @{  # Specify the sender
            emailAddress = @{
                address = $EmailSender
            }
        }
    }
}

# Send Email using Microsoft Graph API
Write-Host "Sending email using Microsoft Graph API..."
try {
    # -Depth 5 is needed: ConvertTo-Json truncates nested objects at depth 2
    # by default, which would mangle the message body.
    Invoke-RestMethod -Uri $mailSendUrl -Method Post -Headers $header -Body ($emailBody | ConvertTo-Json -Depth 5) -ContentType "application/json" | Out-Null

    # sendMail returns 202 Accepted with no body, so there is no StatusCode to
    # inspect; if Invoke-RestMethod did not throw, the request succeeded.
    Write-Host "Email sent successfully."
} catch {
    Write-Error "An error occurred while sending the email: $_"
}







https://redd.it/1gb2835
@r_devops