Reddit DevOps
Reddit DevOps. #devops
Thanks @reddit2telegram and @r_channels
AWS NLB stuck on pending on new KOPS cluster

I have a new kOps cluster I created today and am trying to get the cluster to provision an NLB so my ingress will work. I am using the YAML provided here: https://kubernetes.github.io/ingress-nginx/deploy/#aws - I have split the file into its own sections and everything deploys fine, except that the load balancer Service is stuck in the Pending state, and describing the Service tells me nothing useful other than how long it has been in that state.

Bottom of the describe output:

Normal EnsuringLoadBalancer 103s (x47 over 3h27m) service-controller Ensuring load balancer

My ingress.yaml file

apiVersion: networking.k8s.io/v1beta1
# apiVersion: networking.k8s.io/v1
kind: Ingress
metadata:
  annotations:
    # add an annotation indicating the issuer to use.
    kubernetes.io/ingress.class: "nginx"
    cert-manager.io/cluster-issuer: "letsencrypt-stage"
    # needed to allow the front end to talk to the back end
    nginx.ingress.kubernetes.io/cors-allow-origin: "https://api.dev.mydomain.ca"
    nginx.ingress.kubernetes.io/cors-allow-credentials: "true"
    nginx.ingress.kubernetes.io/enable-cors: "true"
    nginx.ingress.kubernetes.io/cors-allow-methods: "GET, PUT, POST, DELETE, PATCH, OPTIONS"
    # needed for monitoring - maybe
    prometheus.io/scrape: "true"
    prometheus.io/port: "10254"
    # for the nginx ingress controller
    ad.datadoghq.com/nginx-ingress-controller.checknames: '["nginx","nginxingresscontroller"]'
    ad.datadoghq.com/nginx-ingress-controller.initconfigs: '{},{}'
    ad.datadoghq.com/nginx-ingress-controller.instances: '{"nginx_status_url": "https://%%host%%:18080/nginx_status"},{"prometheus_url": "https://%%host%%:10254/metrics"}'
    ad.datadoghq.com/nginx-ingress-controller.logs: '{"service": "controller", "source": "nginx-ingress-controller"}'
  name: nginx-ingress
  namespace: custom-namespace
spec:
  rules:
  - host: api.dev.mydomain.ca
    http:
      paths:
      - backend:
          serviceName: express-api
          servicePort: 8090
        path: /
  - host: socket.dev.mydomain.ca
    http:
      paths:
      - backend:
          serviceName: socketio
          servicePort: 9000
        path: /
  tls:
  - hosts:
    - api.dev.mydomain.ca
    secretName: express-ingress-cert
  - hosts:
    - socket.dev.mydomain.ca
    secretName: socket-ingress-cert

I am wondering how I can get an NLB to provision and allow me to point DNS at it and have the above ingress resource direct traffic where it needs to go.
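For reference, a Service stuck in Pending on AWS usually means the cloud controller cannot provision the load balancer (commonly missing subnet tags or IAM permissions on the kOps nodes). A minimal sketch of what the load balancer Service looks like with the NLB annotation - namespace, labels, and ports are assumptions matching the upstream ingress-nginx manifest, not the poster's exact file:

```yaml
# Hypothetical sketch: a LoadBalancer Service asking the in-tree AWS
# cloud provider for a Network Load Balancer.
apiVersion: v1
kind: Service
metadata:
  name: ingress-nginx
  namespace: ingress-nginx
  annotations:
    # "nlb" tells the AWS cloud controller to create an NLB
    # instead of the default Classic ELB
    service.beta.kubernetes.io/aws-load-balancer-type: nlb
spec:
  type: LoadBalancer
  selector:
    app.kubernetes.io/name: ingress-nginx
  ports:
  - name: http
    port: 80
    targetPort: http
  - name: https
    port: 443
    targetPort: https
```

If the Service stays Pending with this in place, the next step is usually checking the controller-manager logs and verifying that the public subnets carry the kubernetes.io/role/elb tag and the cluster tag kOps normally applies.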

https://redd.it/nq7yie
@r_devops
Deploy a Containerized React App to AWS, Azure, Google Cloud

Learn how to create a production-ready Docker image for a React app. Then we push that image to a container registry and host it on AWS Fargate, Azure Container Instances, and Google Cloud Run.

Deploy a React App to AWS: https://youtu.be/9nrgqtFHMUc

Deploy a React App to Azure: https://youtu.be/1gEMiFil4q4

Deploy a React App to Google Cloud: https://youtu.be/82Z_VrazXcs
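A production-ready React image is typically a multi-stage build; a minimal sketch (the node and nginx image tags and the create-react-app build output path are assumptions, not the videos' exact Dockerfile):

```dockerfile
# Stage 1: build the static bundle
FROM node:18-alpine AS build
WORKDIR /app
COPY package*.json ./
RUN npm ci
COPY . .
RUN npm run build

# Stage 2: serve the bundle with nginx; the final image
# contains no node_modules or build toolchain
FROM nginx:alpine
COPY --from=build /app/build /usr/share/nginx/html
EXPOSE 80
CMD ["nginx", "-g", "daemon off;"]
```

The two-stage split is what keeps the image small enough to deploy quickly to Fargate, Container Instances, or Cloud Run.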

https://redd.it/nzqa0q
@r_devops
Pulling/Pushing out any AWS ECR images from/to AWS ECR through AWS Route53 CNAME

The main idea of this article is to show how to use a Route53 CNAME to pull and/or push images from/to the AWS ECR service. By default, Amazon doesn't allow this (the SSL handshake fails, because the certificate is signed for Amazon's own domain).

I played around with the Amazon API, Python, and a proxy, and found several solutions:

* Use Python to develop a wrapper that logs in and pulls & pushes AWS ECR images from/to ECR through an AWS Route53 CNAME of the AWS ECR service.
* Use some proxy (e.g. Nginx, Traefik) and set up forwarding rules with the needed headers. This implementation is `TBD` soon!
* Amazon ECR interface VPC endpoints (AWS PrivateLink).

You can read the full article here: [https://medium.com/@solo.metalisebastian/pulling-pushing-out-any-aws-ecr-images-from-to-aws-ecr-through-aws-route53-cname-7c92307f9c25](https://medium.com/@solo.metalisebastian/pulling-pushing-out-any-aws-ecr-images-from-to-aws-ecr-through-aws-route53-cname-7c92307f9c25)

The code: [https://github.com/SebastianUA/ecr-pull-push](https://github.com/SebastianUA/ecr-pull-push)

#AWS #AWSRoute53 #Docker #AWSECR

https://redd.it/qxo7hr
@r_devops
Running AWS Services In A Laptop Using LocalStack

LocalStack is a fully functional mock of AWS services running locally on your computer. We can use it to develop and test cloud and serverless apps offline. It can run through the CLI, in a Docker container, or in a Kubernetes cluster. We can use it to create mocks of S3 buckets, Lambda functions, RDS databases, ECR repositories, and more.
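As a quick illustration - a sketch assuming LocalStack is already running on its default edge port 4566 with dummy credentials configured - the standard AWS CLI can target it via --endpoint-url:

```shell
# Point the AWS CLI at the local mock instead of real AWS
aws --endpoint-url=http://localhost:4566 s3 mb s3://demo-bucket
aws --endpoint-url=http://localhost:4566 s3 cp ./report.csv s3://demo-bucket/
aws --endpoint-url=http://localhost:4566 s3 ls s3://demo-bucket/
```

Everything else (SDK code, Terraform providers) works the same way: override the endpoint and the calls never leave your machine.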

https://www.youtube.com/watch?v=8hi9P1ffaQk

https://redd.it/qy3xqg
@r_devops
Create and Restore RDS Snapshot in a specific time

If you want to learn this in a simpler way, you can refer to this video 👇👇

https://www.youtube.com/watch?v=2qGKr5gn5wo&t=229s

What is Amazon RDS?

Amazon Relational Database Service (Amazon RDS) makes it easy to set up, operate, and scale a relational database in the cloud. It provides cost-efficient and resizable capacity while automating time-consuming administration tasks such as hardware provisioning, database setup, patching, and backups. It frees you to focus on your applications so you can give them the fast performance, high availability, security, and compatibility they need.


To create a DB instance

1. Sign in to the AWS Management Console and open the Amazon RDS console at https://console.aws.amazon.com/rds/.
2. In the upper-right corner of the Amazon RDS console, choose the AWS Region in which you want to create the DB instance.
3. In the navigation pane, choose Databases.
4. Choose Create database.
5. In Choose a database creation method, select Standard Create.
6. In Engine options, choose the engine type: MariaDB, Microsoft SQL Server, MySQL, Oracle, or PostgreSQL.
7. For Edition, if you're using Oracle or SQL Server, choose the DB engine edition that you want to use.
MySQL has only one edition option, and MariaDB and PostgreSQL have none.
8. For Version, choose the engine version.
9. In Templates, choose the template that matches your use case. If you choose Production, the following are preselected in a later step:
   * Multi-AZ failover option
   * Provisioned IOPS storage option
   * Enable deletion protection option
   We recommend these features for any production environment.

10. To enter your master password, do the following:
    1. In the Settings section, open Credential Settings.
    2. If you want to specify a password, clear the Auto-generate a password check box if it is selected.
    3. (Optional) Change the Master username value.
    4. Enter the same password in Master password and Confirm password.
11. For the remaining sections, specify your DB instance settings. For information about each setting, see Settings for DB instances.
12. Choose Create database.
    If you chose to use an automatically generated password, the View credential details button appears on the Databases page. To view the master user name and password for the DB instance, choose View credential details.
13. For Databases, choose the name of the new DB instance.
On the RDS console, the details for the new DB instance appear. The DB instance has a status of creating until the DB instance is created and ready for use. When the state changes to Available, you can connect to the DB instance. Depending on the DB instance class and storage allocated, it can take several minutes for the new instance to be available.


To restore a DB instance to a specified time

1. Sign in to the AWS Management Console and open the Amazon RDS console at https://console.aws.amazon.com/rds/.
2. In the navigation pane, choose Automated backups.
3. Choose the DB instance that you want to restore.
4. For Actions, choose Restore to point in time.
The Restore to point in time window appears.
5. Choose the Latest restorable time to restore to the latest possible time, or choose Custom to choose a time.
If you chose Custom, enter the date and time to which you want to restore the instance.
6. For the DB instance identifier, enter the name of the target restored DB instance. The name must be unique.
7. Choose other options as needed, such as DB instance class, storage, and whether you want to use storage autoscaling.
8. Choose Restore to point in time.
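The same restore can be scripted instead of clicked through. A sketch with the AWS CLI - instance identifiers and the timestamp are placeholders:

```shell
# Restore a new instance from my-source-db's automated backups
# to a specific point in time (UTC)
aws rds restore-db-instance-to-point-in-time \
  --source-db-instance-identifier my-source-db \
  --target-db-instance-identifier my-restored-db \
  --restore-time 2021-11-30T23:45:00Z

# Or restore to the latest restorable time instead
aws rds restore-db-instance-to-point-in-time \
  --source-db-instance-identifier my-source-db \
  --target-db-instance-identifier my-restored-db \
  --use-latest-restorable-time
```

Note that point-in-time restore always creates a new DB instance; it does not overwrite the source.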

https://redd.it/r6d6cr
@r_devops
Use GoToAWS to simplify working with the AWS CLI.

GoToAWS is a tool that simplifies the AWS CLI for several operations.

I'm not sure how well-known it is, so I wanted to show it off. This video is short and digestible, so I hope you all enjoy it.

https://www.youtube.com/watch?v=uLtx1PUUZJQ

Let me know if you have any questions!

Cheers!

https://redd.it/r9ijro
@r_devops
Mount S3 Objects to Kubernetes Pods

One of our customers asked for a solution to mount large files from S3 transparently to EKS pods.

Here's our solution - complete with a Docker image and a Helm chart:

https://dev.to/otomato_io/mount-s3-objects-to-kubernetes-pods-12f5

#kubernetes #eks #aws

https://redd.it/sgx57e
@r_devops
New Route53 CLI release - Get info about your records from the terminal, quickly!

New Release - r53

Example:

$ r53 -q my.company.domain.com

It will return a list:

* Hosted Zone ID + web URL
* The target behind it (load balancer, Lambda, etc.) + web URL to the target
* Recursively expanded records
* NS records verified against dig

Install:

$ brew tap isan-rivkin/toolbox

$ brew install r53

New features:

- Exposed an SDK on top of the CLI

- Now supports JSON output (--output-json)

https://github.com/Isan-Rivkin/route53-cli

#aws #route53 #golang #go #dns #networking

https://redd.it/u4u0it
@r_devops
How to automate AWS Marketplace publishing with Ansible - A beginner's guide

Hello everyone,

I've been a long-time subscriber to this subreddit, but this is my first post. I recently published an article on automating AWS Marketplace publishing using Ansible. If you're new to Ansible or are looking to streamline your AWS Marketplace publishing process, this article is for you!

In this article, I cover the basics of Ansible, how to create an EC2 instance, how to create an Amazon Machine Image (AMI), and how to use Ansible to automate the publishing process on the AWS Marketplace.

I also share some tips and best practices for using Ansible to automate your AWS Marketplace publishing.
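To give a flavor of the AMI-creation step, here is a minimal playbook sketch - not the article's exact code; the instance ID, names, and tags are placeholders, and the modules are from the amazon.aws and built-in collections:

```yaml
# Hypothetical sketch: create an AMI from a prepared EC2 instance
- name: Build an AMI for Marketplace publishing
  hosts: localhost
  connection: local
  gather_facts: false
  tasks:
    - name: Create the AMI from the prepared instance
      amazon.aws.ec2_ami:
        instance_id: i-0123456789abcdef0   # placeholder
        name: my-product-ami-1.0.0
        wait: true
        state: present
        tags:
          Product: my-product
      register: ami_result

    - name: Show the new AMI id
      ansible.builtin.debug:
        var: ami_result.image_id
```

The resulting AMI ID is what you then hand to the Marketplace listing process.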

You can find the article here: https://medium.com/@arshad.zameer/getting-started-with-ansible-for-aws-marketplace-publishing-a547cc13d182

I hope the article is helpful to you. If you have any questions or feedback, feel free to comment.

Thanks for reading!

#Ansible #AWS #AWSMarketplace #Automation

https://redd.it/10iaq9a
@r_devops
Surf CLI - New Feature: Fuzzy search DynamoDB (even encoded data)

**DynamoDB:**

[https://github.com/Isan-Rivkin/surf#aws-dynamodb-usage](https://github.com/Isan-Rivkin/surf#aws-dynamodb-usage)


**TLDR**

* surf ddb --query "my-text-*" --table "^prod" --out json
* Pattern matching inside objects
* Additional supported formats: JSON, Protobuf, Base64, Binary


**Supported Platforms**

* surf <platform> -q <some text>
* AWS Route53, DynamoDB, ACM, S3, Opensearch
* Elasticsearch
* [Logz.io](https://logz.io/)
* Hashicorp Vault, Consul


**Overview**

SURF is built for Infrastructure Engineers as a CLI tool that enables searching any pattern across different platforms. Usually, the results are returned with a direct web URL.

The search process depends on the context. For example, if you're searching in Vault, it'll pattern-match against keys, whereas if you're searching a DNS address in AWS Route53, it'll return links to the targets behind it (e.g. a load balancer).

https://redd.it/10v119c
@r_devops
What is OpenID Connect Authentication? A Practical Guide

Hello, devops community,
Today, I present to you a topic that is less discussed and often taken for granted in our daily jobs.
OpenID Connect is among our industry's most widely used and least discussed topics.
Yet it is so crucial when it comes to granting third-party access to a service provider. Have you seen those "Sign in with Google" buttons before?
In this guide, I will explain the notion of OIDC using a practical, real-world example: granting GitHub Actions access to an AWS account.
Feel free to ask any questions that come up.
https://developer-friendly.blog/2024/04/14/what-is-openid-connect-authentication-a-practical-guide/
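By way of illustration - a sketch, not the blog's exact example; the role ARN, region, and action version are placeholders/assumptions - a GitHub Actions job that assumes an AWS role via OIDC looks roughly like:

```yaml
# Requires an IAM role whose trust policy allows GitHub's OIDC provider
name: deploy
on: push

permissions:
  id-token: write   # lets the job request an OIDC token from GitHub
  contents: read

jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: aws-actions/configure-aws-credentials@v4
        with:
          role-to-assume: arn:aws:iam::123456789012:role/github-actions-role
          aws-region: us-east-1
      - run: aws sts get-caller-identity   # verify the assumed identity
```

No long-lived access keys are stored anywhere; the short-lived token is minted per run.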


#oauth2 #oidc #github #aws

https://redd.it/1c4fbhh
@r_devops
How to Access AWS From Azure VM Using OpenID Connect

Do you work in a multi-cloud environment?

Do you usually find yourself passing around cloud credentials? Hasn't it ever felt kinda wrong?

Did you ask yourself if there's a better way around this?

I'm here to tell you that there is. There is a much better way to handle such service-to-service communications.

This blog post elaborates on what OpenID Connect can do to help you avoid passing around long-lived credentials, relieving you from the chore of frequent secret rotation.

If you enjoy this post, please share it with your network.

#aws #azure #oidc #openidconnect

https://developer-friendly.blog/2024/05/27/how-to-access-aws-from-azure-vm-using-openid-connect/



https://redd.it/1d1mjhd
@r_devops
AI agent with multi-cloud support: AWS (SageMaker), GCP (Gemini), Azure (Copilot)

I'm building an AI assistant to guide the setup of cloud resources in a secure manner. 

Example prompt: Set up a production-grade AWS foundation with the following requirements:
- VPC in us-east-1 with 3 availability zones
- Private and public subnets
- Network segmentation for different workloads
- Implement security best practices
- Enable encryption for all services
- Set up CloudTrail for audit logging
- Configure AWS Backup for critical resources
- Implement WAF and Shield for protection
https://medium.com/@rasvihostings/ai-agent-to-set-up-a-foundation-in-a-public-cloud-secure-manner-4a555d8fdd84

#googlecloud #aws #azure #gemini #SageMaker #Copilot #python #terraform

https://redd.it/1gf5m0b
@r_devops
Stuck between AWS and Azure — need your advice!

I’m about to dive into Cloud Computing, but I’m currently torn between starting with AWS or Azure.

I’ve heard the differences between them aren’t that big in terms of core concepts, and that Azure might be easier for beginners, especially with its user-friendly interface and Microsoft integration.

But I’m also thinking about the bigger picture:
• Which one has better career opportunities overall?
• Which one provides more flexibility and long-term growth?
• And is it true that once you learn one, switching to the other is relatively smooth?

Would love to hear your thoughts and experiences! Any advice or perspective is welcome 🙌

#CloudComputing #AWS #Azure #CareerGrowth #ITCareers #TechLearning

https://redd.it/1lpjif2
@r_devops
Looking for Job (Please Reply)

Hi Everyone,

I hope you’re all doing well.

I’m writing to express my interest in the Junior DevOps Engineer position. I recently completed a 3-month internship as a DevOps Intern.

I have good technical knowledge of DevOps practices and hands-on experience with the major DevOps tools.

I worked on several real-world DevOps projects:

• Deployment of a MERN Stack application on AWS EKS with DevSecOps integration, Helm charts, and ArgoCD.
• Automated infrastructure monitoring using Terraform, Prometheus, Grafana, and AWS CloudWatch, including email alerts via AWS SNS for high CPU utilization.
• Serverless automation using AWS Lambda to delete stale AWS snapshots.

Additionally, I bring 4 years of corporate experience, so I'm not a complete fresher; learning and adapting to new skills and tools won't be a big issue for me.

I’m now seeking a full-time opportunity as a Junior DevOps Engineer, where I can contribute, learn, and continue growing within a dynamic environment.

Thank you for your time and consideration. I would truly appreciate the opportunity to be part of your team.

#devops #aws #community #jobsearch #it #hr #hiring #opentowork #linkedintech #ithiring

https://redd.it/1o9ry8k
@r_devops