What is the difference between devops and SRE?
Dear colleagues.
What is the difference between devops and SRE?
Could you please provide an example?
Thanks in advance!
https://redd.it/n5xfix
@r_devops
Question about moving puppet infrastructure to docker
We use Jenkins to set up our Puppet infrastructure and install the product. There are three components involved: Puppet Server 6.x, Jenkins, and Nginx acting as a package repository. If this setup is to be converted to Docker, what is the best approach: combining Jenkins and the Puppet server in one image, or keeping them separate? Nginx will be a separate container.
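A minimal docker-compose sketch of the separate-container layout (image tags, ports, and volume paths are placeholders, not a tested setup):

```yaml
services:
  puppetserver:
    image: puppet/puppetserver:6.x   # placeholder tag
    ports:
      - "8140:8140"
  jenkins:
    image: jenkins/jenkins:lts
    ports:
      - "8080:8080"
    volumes:
      - jenkins_home:/var/jenkins_home
  nginx:
    image: nginx:stable              # serves the package directory
    ports:
      - "80:80"
    volumes:
      - ./packages:/usr/share/nginx/html:ro

volumes:
  jenkins_home:
```

Separate images are generally easier to upgrade and scale independently, which is why combining Jenkins and the Puppet server in a single image is usually avoided.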
https://redd.it/n5zzw2
@r_devops
Help required to setup vault with RAFT HA and database storage backend.
I am trying to set up HashiCorp Vault with Raft for high availability and Postgres as the storage backend, with TLS enabled. The only problem I'm facing at the moment is that I am unable to join the Vault nodes into the Raft HA cluster.
I'm running Vault on Docker (the three nodes are part of the same Docker network) and used OpenSSL to generate a self-signed certificate to test the TLS setup.
This is my vault.hcl:
ha_storage "raft" {
  path    = "/vault/file/"
  node_id = "vault3"
}

storage "postgresql" {
  connection_url = "postgres://<username>:<password>@postgres:5432/<dbname>?sslmode=disable"
}

listener "tcp" {
  address       = "0.0.0.0:8220"
  tls_cert_file = "/etc/certs/kms.crt"
  tls_key_file  = "/etc/certs/kms.key"
}

default_lease_ttl = "2208h"
max_lease_ttl     = "4320h"
disable_mlock     = true
ui                = true

cluster_addr = "https://vault3:8221"
api_addr     = "https://vault3:8220"
The first node, upon initialization and unseal, joins itself to a new Raft cluster.
The second, which is unsealed using the keys generated when the first node was initialized, goes into standby mode. When I try to join the second node to the first node's Raft cluster, I get the following error:
vault operator raft join -leader-client-cert=/etc/certs/kms.crt -leader-client-key=/etc/certs/kms.key
I also tried the -client-cert and -client-key options; same error:
core: attempting to join possible raft leader node: leaderaddr=https://vault1:8200
vault1 [INFO] http: TLS handshake error from 172.25.0.6:39286: remote error: tls: bad certificate
vault2 [WARN] core: join attempt failed: error="error during raft bootstrap init call: Put "https://vault1:8200/v1/sys/storage/raft/bootstrap/challenge": x509: certificate is not valid for any names, but wanted to match vault1"
vault2 [ERROR] core: failed to join raft cluster: error="failed to join any raft leader node"
I recreated the certificate with vault1 as the FQDN; this gives me the following error:
core: attempting to join possible raft leader node: leaderaddr=https://vault1:8200
vault2 [WARN] core: join attempt failed: error="error during raft bootstrap init call: Put "https://vault1:8200/v1/sys/storage/raft/bootstrap/challenge": x509: certificate relies on legacy Common Name field, use SANs or temporarily enable Common Name matching with GODEBUG=x509ignoreCN=0"
vault2 [ERROR] core: failed to join raft cluster: error="failed to join any raft leader node"
vault1 [INFO] http: TLS handshake error from 172.18.0.5:47794: remote error: tls: bad certificate
I set the environment variable GODEBUG=x509ignoreCN=0, but it didn't fix anything.
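For reference, the second error suggests the certificate lacks subject alternative names. A self-signed certificate covering all three node names could be generated like this (assumes OpenSSL 1.1.1+ for the -addext flag; file names match the vault.hcl above):

```shell
# Self-signed cert whose SANs cover every Vault node name
openssl req -x509 -newkey rsa:2048 -nodes -days 365 \
  -keyout kms.key -out kms.crt \
  -subj "/CN=vault" \
  -addext "subjectAltName=DNS:vault1,DNS:vault2,DNS:vault3,DNS:localhost,IP:127.0.0.1"

# Verify the SANs made it into the cert
openssl x509 -in kms.crt -noout -text | grep -A1 "Subject Alternative Name"
```

Go's TLS stack matches only SANs, not the Common Name, so each node's hostname must appear as a DNS entry.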
Any help would be much appreciated!
https://redd.it/n5znsz
@r_devops
Test API of docker container in Azure DevOps CI/CD pipeline
Hi!
I'm working on setting up some CI/CD pipelines for a couple of small containers. The pipeline should be as follows:
1. Build docker image
2. Start container
3. Query the REST api of said container
4. Make sure the response is "reasonable"
5. Push to ACR
6. Deploy to AKS
It's steps 3 and 4 that I'm struggling with. It seems kinda basic, but I haven't found any good resources online. I'm new to DevOps and I'm guessing I'm just googling the wrong terms, as this sounds like a basic and standard thing one would do in a pipeline. One way, I guess, would be to docker run the container, curl it from a bash command, regex the response, and exit if the response contains "error". But I'm thinking there's probably a prettier solution out there.
Any suggestions or references to online resources would be highly appreciated!
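Steps 2-4 can be done in a plain script step; a sketch of the relevant azure-pipelines.yml fragment (image name, port, and the /health endpoint are hypothetical):

```yaml
steps:
  - script: docker build -t myapp:test .
    displayName: Build image
  - script: |
      docker run -d --rm -p 8080:8080 --name myapp-test myapp:test
      sleep 5   # crude wait; a curl retry loop is more robust
      response=$(curl -fsS http://localhost:8080/health)
      docker stop myapp-test
      if echo "$response" | grep -qi "error"; then exit 1; fi
    displayName: Smoke-test the API
```

A non-zero exit code fails the step, which stops the pipeline before the push to ACR.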
https://redd.it/n63w2v
@r_devops
I developed a tool to train neural networks on AWS with a single command
Hey everyone,
My friend and I developed Nimbo, a dead-simple CLI that wraps the AWS CLI, allowing you to run code on AWS as if you were running it locally. GitHub: https://github.com/nimbo-sh/nimbo. Docs: https://docs.nimbo.sh.
We decided to build this because we were frustrated with how cumbersome using AWS was, and we just wanted to be able to run jobs on AWS as easily as we run them locally. All in all, we didn't like the current AWS DevOps user experience, and we thought we could drastically simplify it for the machine learning/scientific computing niche.
For this reason, we also provide many useful commands to make it faster and easier to work with AWS, such as one-command Jupyter notebooks on EC2, easily checking prices, logging onto an instance, or syncing data to/from S3 (you can see some useful commands here).
Unlike other similar services, we are solely client-side, meaning that the code runs on your EC2 instances and data is stored in your S3 buckets (we don't have a server; all the infrastructure orchestration happens in the Nimbo package).
We have tons of ideas for Nimbo, such as docker support and one-command neural network deployments.
We are happy to receive any feedback and suggestions you have.
https://redd.it/n6486v
@r_devops
build hello world java file in jenkins pipeline
hey folks,
how do we get the hello-world class file and jar file, and build, test, deploy, and release them in a Jenkins pipeline? I am really stuck on creating the pom.xml for the Java class file.
I also tried adding a git repo in a scripted pipeline, but it says the recommended git tool is none and that no credentials were provided.
Could anyone tell me the exact process to get the jar file and build it in Jenkins?
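For what it's worth, a minimal declarative Jenkinsfile for a Maven hello-world might look like this (repo URL is a placeholder; assumes Maven is available on the agent):

```groovy
pipeline {
    agent any
    stages {
        stage('Checkout') {
            steps {
                // credentialsId is only needed for private repos
                git url: 'https://github.com/your-org/hello-world.git', branch: 'main'
            }
        }
        stage('Build & Test') {
            steps {
                sh 'mvn -B clean verify'   // compiles, runs tests, builds target/*.jar
            }
        }
    }
    post {
        success {
            archiveArtifacts artifacts: 'target/*.jar', fingerprint: true
        }
    }
}
```

A starter pom.xml can be generated with `mvn archetype:generate -DarchetypeArtifactId=maven-archetype-quickstart`, which sets up the standard layout the pipeline expects.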
https://redd.it/n65i19
@r_devops
Mac In cloud alternatives
Amazon's new Mac EC2 instances are expensive and we're unlikely to get approval to use them.
We currently use MacinCloud; it's just that the builds are slow.
https://redd.it/n69rft
@r_devops
Oauth flow and its impact on infrastructure
Hello, first post here :)
I'm helping with an OAuth / OpenID Connect implementation by designing the infrastructure, and the following issue took me by surprise: what happens once the final token has been acquired by the app?
Let's assume the following scenario:
1. Company Inc has a service that verifies personal assets. Enterprise Inc wants to use Company's services in order to offload that verification.
2. Company Inc decides to implement OAuth as a way to allow more entities to use its services, and decides to eat its own dog food, i.e. use OAuth internally.
3. So far so good; then Little Business Ltd decides to send Company thousands of assets to be verified.
4. Once the final token has been acquired by the backend app, how does the backend app know whether the token is still valid, beyond checking the expiry time? Should the backend app ask the authentication provider whether the token is still valid? Does it need to ask via an API endpoint for every transaction?
5. If the answer to the above is more or less yes, does that mean I need to build a separate (and big!) infrastructure?
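On point 4: if the tokens are JWTs, expiry can be checked locally instead of calling the provider for every transaction; a round trip is only needed for revocation checks (e.g. via RFC 7662 token introspection). A minimal sketch that reads the exp claim; note it deliberately skips signature verification, which a real backend must also do:

```python
import base64
import json
import time

def jwt_claims(token: str) -> dict:
    """Decode a JWT's payload WITHOUT verifying the signature.
    Real validation must also check the signature against the
    provider's public key (e.g. from its JWKS endpoint)."""
    payload = token.split(".")[1]
    padded = payload + "=" * (-len(payload) % 4)   # restore base64 padding
    return json.loads(base64.urlsafe_b64decode(padded))

def is_expired(token: str, now=None) -> bool:
    """True if the token's exp claim is in the past."""
    claims = jwt_claims(token)
    return claims.get("exp", 0) <= (time.time() if now is None else now)
```

With this pattern the per-transaction cost stays local; only revoked-but-unexpired tokens require asking the provider, so the extra infrastructure can stay small.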
Thanks in advance!
https://redd.it/n68gya
@r_devops
Leave on-prem devops job to pursue cloud?
Hi, could use some advice. I'm an on-prem devops engineer with about 2 years experience. I mostly have learned Ansible, Jenkins, Docker, Linux sysadmin and related things so far at my current job (started as a Junior and now an intermediate).
I like my job (and the people I work with) and I see there's a path for me to grow/become senior devops. My concern is if I should try to switch to a cloud company, as I have no AWS/Azure/GCP experience and due to the nature of my company we never will.
Is it stupid to leave a job I like just so that I can get on the cloud track sooner rather than later? Or is it something I could just learn in my free time? The pay at my current job is fine though if I switched I expect I could get an extra 10-20%.
Not really sure what to do... thanks!
https://redd.it/n62baq
@r_devops
Can I use Github Secrets locally?
So, when I'm not doing production builds but rather basic local development with my app (a Dockerfile and docker-compose up), can I use GitHub Secrets without knowing the secrets?
More context: I could know them, of course, but assuming you have a team of devs, how can they use the secrets for their day-to-day development without actually knowing the secrets?
https://redd.it/n65195
@r_devops
Gaming dev industry insight
I have been working as a project manager (PM) within ERP for about 9 years as a consultant, and have managed different types of digitalization projects, which are often cross-functional with a large group of stakeholders involved and a mix of agile and waterfall development.
I have lately been interested in continuing as a PM within the gaming industry, and would like some insight into what those projects look like at a high level: what roles are included, the different project phases, and the systems used for knowledge sharing and tracking (ServiceNow, Jira, etc.).
Thanks in advance! 🙏
https://redd.it/n62pze
@r_devops
Oracle integration with git
Hello r/DevOps
We have an Oracle database that we use. We store our schema in SVN. We are now planning to migrate our code to git. What is the best way to set up git so that it tracks schema and table updates, and the build only builds what has changed on the database? We can build with Jenkins to our server.
https://redd.it/n5thor
@r_devops
Code profiling dashboards of monitoring tools
I already have Datadog running along with CloudWatch for microservices running in Kubernetes (without EKS). I want a consolidated dashboard with code profiling insights, similar to Datadog's continuous profiler. I am looking for a CloudWatch dashboard with metrics such as:
* time spent by method/functions on CPU,
* garbage collection,
* lock contention
* I/O
* Monitor code performance variations in production by applying long-term, code-level metrics to alerts and dashboards
* Compare code behavior and impact across hosts, services, and versions during canary, blue/green, or shadow deploys
* Isolate the most resource-heavy functions to quickly understand what is causing a spike and decide whether to roll back or ship a fix
Though Datadog is used in the environment, I learnt that its continuous profiling dashboard would charge $0.10 per compressed GB of log data scanned. Could anyone suggest such a dashboard in AWS CloudWatch, along with prices?
I would really appreciate your two cents on other monitoring tools with low pricing.
https://redd.it/n5suds
@r_devops
Grafana and AWS CloudWatch Urgent
I want to obtain custom metrics like memory utilization and project them in Grafana.
I have successfully configured custom metrics in CloudWatch, but I can't obtain them in Grafana.
The predefined metrics (CPU_Utilization) work fine in Grafana, but when I specify the metric "disk_used_percent" it says "no data".
Immediate help would be appreciated.
https://redd.it/n5j564
@r_devops
Direktiv: serverless custom plugins in Go, Java, Node, Python or Rust (or anything)
G'day DevOps,
We previously posted about our open-source project for serverless workflows, called Direktiv. Since then we've added examples of how to integrate and run your own plugins / containers in the platform; they're in the GitHub repo:
https://github.com/vorteil/direktiv-apps/tree/master/examples
We've also written an article which we hope would help:
https://blog.direktiv.io/direktiv-serverless-custom-plugins-in-go-java-node-python-or-rust-or-anything-1b41a257af91
Any feedback, as always, is welcome and helpful!
https://redd.it/n6jo5m
@r_devops
Projects to work on in personal time?
Hi guys,
I'm starting to get into DevOps, currently in help desk. I'm working on my Linux+. I'd like to do some projects on the side to get more familiar with aspects of devops in practice, but I have no idea where to start, or the projects I look up sound like gibberish. I need an ELI5 project, lol. Anyone have any ideas/links?
https://redd.it/n5i9ph
@r_devops
What tools are you using for log/event monitoring?
I just started working for a company that wants to make better log monitoring/management tools for DevOps individuals/teams. They have great engineers, but all come from enterprise backgrounds so they don't know exactly what smaller teams need.
I always worked solo in web dev so I used pretty basic setups/tools. It was simple enough for me to jump into logs directly to investigate. So I also don't have a lot of experience here.
I was wondering 1) what tools are you using that you love, and 2) what is still missing from those tools? I'm hoping your feedback will help point our team in the right direction!
https://redd.it/n5gt1o
@r_devops
Enter the Abattoir - building `a la carte` gitops tooling for microservices
I wrote this about our journey to microservices at Achievers. We wanted to create a self-serve deployment model to allow engineers to own their services in production.
https://achievers.engineering/enter-the-abattoir-ee5e2019f0b3
https://redd.it/n5fkql
@r_devops
Suggestions on what services to choose for migrating to AWS
Hi all.
I'm interested in migrating my application to the cloud, specifically to AWS. The thing is, I've never had any experience with these platforms or with how to choose services based on your needs.
So here is a brief technical description of my app. It is just a Java Spring Boot API that uses MariaDB and Couchbase for data storage. Some days ago I was reading the Couchbase documentation, and their suggestion is an r4.4xlarge instance just for hosting Couchbase.
So, based on my limited knowledge of this kind of stuff, I thought to host the rest of the app on an a1.4xlarge general-purpose instance.
One more piece of info that could help is the app's traffic: based on some exaggerated calculations, it could reach 4,750,000 requests/month.
Given all this, are the above instances wise choices? How would you approach this migration? Any ideas, tutorials, and tips are very welcome.
https://redd.it/n5dv9b
@r_devops
Strategy for alerting on database when query results differ b/w DB
What would be the most standard way of alerting if some values in a database vary from environment to environment? I would like a high-level idea of the tooling/components required to do this, ideally FOSS.
I've seen alerts on a single query using Redash, but I doubt I would be able to correlate the results of queries against different databases.
EDIT: I'm also struggling to find the right combination of search keywords to google this and get relevant results.
Context:
I have applications running in different environments with RDS or Aurora databases, depending on the application. I would like to be alerted if there are different results for the same query in different environments.
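The comparison step itself is simple; a minimal sketch, using sqlite3 as a stand-in for the real RDS/Aurora drivers (a scheduler and an alerting hook, e.g. a webhook to your pager, would wrap this):

```python
import sqlite3  # stand-in for the real drivers (psycopg2, pymysql, ...)

def fetch(conn, query):
    """Run the query and return an order-insensitive result set."""
    return sorted(conn.execute(query).fetchall())

def diff_environments(conns, query):
    """Run the same query in every environment; return the environments
    whose rows differ from the first (baseline) environment."""
    results = {env: fetch(conn, query) for env, conn in conns.items()}
    baseline = next(iter(results.values()))
    return {env: rows for env, rows in results.items() if rows != baseline}
```

Anything returned by diff_environments is a mismatch worth alerting on; an empty dict means all environments agree.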
https://redd.it/n6ib3l
@r_devops
How do I make all my traffic HTTPS, even when the IP is hit directly?
I have a server running Apache, and an Angular app running on port 3000. That app is bound to ports 80 and 443 through the Apache configuration, which redirects to HTTPS; but if I hit the IP directly, the redirect doesn't happen.
How can this be done?
(Hopefully I was clear enough)
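One common approach is a catch-all HTTP virtual host that redirects everything, including requests made directly to the IP, to the canonical HTTPS name (example.com is a placeholder):

```apache
<VirtualHost *:80>
    # ServerAlias * matches any Host header, including a raw IP
    ServerName example.com
    ServerAlias *
    Redirect permanent / https://example.com/
</VirtualHost>
```

Note that requests arriving by IP over HTTPS will typically still trigger a browser certificate warning, since certificates are usually issued for names rather than IP addresses.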
https://redd.it/n6hqkj
@r_devops