Making simple Nmap SPA web GUI with Apache, AngularJS and Python Twisted
The last time I developed dynamic web applications was years ago. I used CGI and #PHP back then. 🙂 Now I am really interested in the modern approach, where you have a Single Page Application (SPA) written in HTML and #JavaScript that makes HTTP requests to some external #API.
It’s pretty cool, because your application naturally becomes API-centric. You work on the human interface and improve integration capabilities at the same time. And the task of securing your web app mostly reduces to securing your formalized #API.
The very best way to learn something new is to write a post about it. 😉 Here I will reproduce my own steps of making a very basic web app:
1. Launch an #Apache web server with HTTP/HTTPS.
2. Make a simple #API service: an #Nmap wrapper.
3. Make a web application with a “multipage” experience. There should be at least two pages: Scan and About.
4. On the Scan page it should be possible to input a target (hostname or IP) and #scan arguments, and launch the #scan by clicking a button. The behavior should be the same when the target is passed as a parameter in the address bar.
5. The other pages should contain some static text.
As you can see, it is a very limited task, but it should clear up the most confusing parts of the process.
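A note on step 2: whatever framework serves the API (the post uses Twisted), the target from the user must never reach a shell unchecked. Here is a minimal sketch of the validation and command-building part only, with helper names of my own; the Twisted plumbing around it is left out:

```python
import re
import shlex
import subprocess

# Accept only plain hostnames/IPs: letters, digits, dots, hyphens, underscores.
# This rules out shell metacharacters before we ever build a command.
TARGET_RE = re.compile(r"^[A-Za-z0-9._-]{1,253}$")

def build_nmap_command(target, args="-sV"):
    """Return an argv list for nmap, or raise ValueError on a bad target."""
    if not TARGET_RE.match(target):
        raise ValueError("target must be a hostname or IP address")
    return ["nmap"] + shlex.split(args) + [target]

def run_scan(target, args="-sV"):
    # "-oX -" writes the XML report to stdout, which is easy to return
    # from an API handler.
    cmd = build_nmap_command(target, args + " -oX -")
    return subprocess.run(cmd, capture_output=True, text=True).stdout
```

An API handler then only has to call `run_scan()` and return its output to the SPA.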
#Twisted #SSL #python #Nmap #nginx #JavaScript #GoogleChrome #Firefox #CORS #Apache #AngularJS #API
Read more: https://avleonov.com/2018/02/05/making-simple-nmap-spa-web-gui-with-apache-angularjs-and-python-twisted/
Kenna Security: Connectors and REST API
In the last post about the #Kenna Security cloud service, I mentioned its main features for analyzing data from different #vulnerability scanners. Now let’s see how to import #Tenable #Nessus #scan results into #Kenna. Here you can see the list of connectors for all supported products:
Three connectors for #Nessus are available:
* **Nessus Importer** retrieves existing #scan results from your #Nessus server.
* **Nessus Scanner** can schedule scans on your #Nessus server.
* **Nessus XML** imports #xml (.Nessus2) files.
The first two connectors work with the #Nessus server directly. And they probably won’t work anymore with #Nessus Professional 7, because the #API was removed (see “New #Nessus 7 Professional and the end of cost-effective #VulnerabilityManagement (as we knew it)“). If the #Nessus server is deployed on-premise, you should use the special #Kenna Virtual Tunnel.
The last connector, “Nessus XML”, is the most flexible. No matter how you got your #scan results, it will be possible to import them into #Kenna. See how to get XML reports from the #Nessus server in the post “Retrieving #scan results through #Nessus API“. You can upload XML #scan results using the #Kenna web GUI (not a very efficient way, but for testing – why not?) or the REST #API.
To use the #Kenna REST #API you will need an Application Token. Go to the Settings menu -> Applications:
#xml #Tenable #python #Nessus #Kenna #VulnerabilityManagement #API
Read more: https://avleonov.com/2018/02/15/kenna-security-connectors-and-rest-api/
Masking Vulnerability Scan reports
Continuing the series of posts about #Kenna (“Analyzing Vulnerability Scan data“, “Connectors and REST API“) and similar services. Is it actually safe to send your #vulnerability data to an external cloud service for analysis? Leakage of such information could potentially cause great damage to your organization, right?
It’s once again a problem of trust in the vendor. IMHO, in some cases it may make sense to hide the real hostnames and IP addresses of the target hosts in #scan reports. It would then be clear to the analysis vendor that some critical #vulnerability exists somewhere, but not where exactly.
To do this, each hostname/IP address should be replaced with a value of a similar type, and with the same value each time, so that the algorithms of a Kenna-like service can still work with these masked reports. This means that we need to create a replacement dictionary.
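The replacement-dictionary idea can be sketched like this (my own illustrative class, not the code from the post): each value gets a stable fake counterpart of the same type, IPs map to IPs and hostnames to hostnames, and repeated lookups return the same mask.

```python
import ipaddress
import itertools

class Masker:
    """Map each real hostname/IP to a stable fake value of a similar type."""
    def __init__(self):
        self.replacements = {}   # the replacement dictionary itself
        # Fake IPs come from a reserved private range, fake hostnames
        # from a throwaway domain.
        self._fake_ips = (str(ip) for ip in
                          ipaddress.ip_network("10.0.0.0/16").hosts())
        self._fake_hosts = ("host{}.masked.local".format(i)
                            for i in itertools.count(1))

    def mask(self, value):
        if value not in self.replacements:
            try:
                ipaddress.ip_address(value)   # is it an IP address?
                self.replacements[value] = next(self._fake_ips)
            except ValueError:                # no -> treat as a hostname
                self.replacements[value] = next(self._fake_hosts)
        return self.replacements[value]
```

Running every hostname/IP field of a report through `mask()` keeps the report internally consistent while hiding the real infrastructure; keeping `replacements` lets you reverse the mapping when the analysis comes back.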
#xml #python #Nessus #masking #Kenna #json #VulnerabilityManagement #Concept
Read more: https://avleonov.com/2018/02/22/masking-vulnerability-scan-reports/
Converting Nmap xml scan reports to json
Unfortunately, #Nmap cannot save results in #json. All available output options:
-oN (normal output)
-oX (XML output)
-oS (ScRipT KIdd|3 oUTpuT)
-oG (grepable output)
-oA (Output to all formats)
And processing #xml results may not be an easy task. Just look at how I analyze the contents of a #Nessus report in “Parsing #Nessus v2 XML reports with python“. Not the most readable code, right? So what alternatives do we have?
A formal XML to #json conversion is impossible: the formats are too different. However, there are #python modules, for example #xmltodict, that can reliably convert XML into Python structures of dictionaries, lists and strings. They have to change some parameter names to avoid collisions, but in my opinion this is a small price for the convenience.
So, let’s see how this works for the following #Nmap command:
`nmap -sV -oX nmap_output.xml avleonov.com 1>/dev/null 2>/dev/null`
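With xmltodict installed (`pip install xmltodict`) the conversion is essentially one call: `xmltodict.parse(open("nmap_output.xml").read())`. To make the renaming concrete without the extra dependency, here is a small stdlib sketch of my own that applies the same conventions xmltodict uses — attributes get an “@” prefix, text becomes “#text”, and repeated child tags collapse into lists:

```python
import json
import xml.etree.ElementTree as ET

def element_to_dict(elem):
    """Convert an Element roughly the way xmltodict does."""
    d = {"@" + k: v for k, v in elem.attrib.items()}
    for child in elem:
        value = element_to_dict(child)
        if child.tag in d:                        # second <port>, <host>, ...
            if not isinstance(d[child.tag], list):
                d[child.tag] = [d[child.tag]]     # promote to a list
            d[child.tag].append(value)
        else:
            d[child.tag] = value
    if elem.text and elem.text.strip():
        d["#text"] = elem.text.strip()
    return d or None

# with open("nmap_output.xml") as f:
#     tree = {"nmaprun": element_to_dict(ET.fromstring(f.read()))}
#     print(json.dumps(tree, indent=4))
```

Note how `@portid` no longer collides with a child element named `portid` — this is exactly the renaming mentioned above.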
#xmltodict #xml #ServiceDetection #python #PortScanning #Nmap #json #CPE #VulnerabilityManagement
Read more: https://avleonov.com/2018/03/11/converting-nmap-xml-scan-reports-to-json/
How to correlate different events in Splunk and make dashboards
Recently I spent some time working with #Splunk. Despite the fact that I have done various #Splunk searches before, for example in “Tracking software versions using #Nessus and Splunk“, correlating different events in #Splunk turned out to be a very different task. And there are not so many publicly available examples of this on the Internet. So, I decided to write a small post about it myself.
Disclaimer: I’m not a pro in #Splunk. I have no idea whether I am doing this the right or optimal way. 😉 I just learned some tricks that worked well for me, and I want to share them with you.
I will show the following case:
1. We have some active network hosts.
2. Some software product should be installed on these hosts.
3. We will send “host X is active” and “software is installed on host X” events to the #Splunk server.
4. We want to get some diagrams in #Splunk that will show us on which hosts the software is installed and how the number of such hosts changes over time.
As you can see, the task is quite trivial and could easily be implemented in pure Python. But the idea is to do it in #Splunk. 😉
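The two event types from step 3 could be produced like this (a sketch; the field names are my own convention, not anything Splunk requires — the point is only that both event types share a `host` field to correlate on):

```python
import json
import socket
import time

def make_event(event_type, host):
    """Emit one event as a JSON line, ready to be forwarded to Splunk
    (written to a monitored file, or posted to an HTTP Event Collector)."""
    return json.dumps({
        "time": int(time.time()),
        "type": event_type,          # "host_active" or "software_installed"
        "host": host,
        "source": socket.gethostname(),
    }, sort_keys=True)

print(make_event("host_active", "192.168.1.10"))
print(make_event("software_installed", "192.168.1.10"))
```

Once both streams are indexed, a search along the lines of `stats values(type) by host` groups the events per host, and hosts whose value set lacks `software_installed` are the ones missing the agent.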
#Splunk #python #json #SIEM #Concept
Read more: https://avleonov.com/2018/07/19/how-to-correlate-different-events-in-splunk-and-make-dashboards/
Sending FireEye HX data to Splunk
#FireEye HX is an agent-based #EndpointProtection solution. Something like an #antivirus, but focused on Advanced Persistent Threats (APT). It has an appliance with #GUI where you can manage the agents and see information about detected security incidents.
As with any agent-based solution, it’s necessary to ensure that the agents are installed on every supported host in your network. You may also want to analyze the alerts automatically. And for both purposes you can use #Splunk. Let’s see how to do it. 😉
Note: everything below is for FireEye Endpoint Security (HX) 4.0.6 and Splunk 7.0.2. If you use some other version, things may be quite different.
The main idea is the following. We should present FireEye host and alert data in JSON format, add some mandatory fields and send these packages to Splunk using the HTTP Event Collector. Then we can process it in Splunk as I showed in "How to correlate different events in Splunk and make dashboards".
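The HTTP Event Collector side can be sketched like this. The `/services/collector/event` endpoint, the `Authorization: Splunk <token>` header and the `{"event": ..., "sourcetype": ...}` envelope come from Splunk’s HEC documentation; the URL, port and token below are placeholders:

```python
import json
import urllib.request

HEC_URL = "https://splunk.example.com:8088/services/collector/event"
HEC_TOKEN = "YOUR-HEC-TOKEN"

def hec_payload(event_dict, sourcetype="fireeye_hx"):
    """Wrap raw FireEye host/alert data into the envelope HEC expects."""
    return json.dumps({"event": event_dict, "sourcetype": sourcetype})

def send_to_splunk(event_dict):
    req = urllib.request.Request(
        HEC_URL,
        data=hec_payload(event_dict).encode(),
        headers={"Authorization": "Splunk " + HEC_TOKEN})
    # On success HEC answers with {"text": "Success", "code": 0}
    return urllib.request.urlopen(req)

# send_to_splunk({"hostname": "pc-001", "agent_version": "26.21.10"})
```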
#Splunk #python #FireEyeHX #FireEye #curl #bash #APT #SIEM #EndpointProtection
Read more: https://avleonov.com/2018/07/29/sending-fireeye-hx-data-to-splunk/
Sending tables from Atlassian Confluence to Splunk
Sometimes when we do automated analysis in #Splunk, it might be necessary to use information that was entered or edited manually. For example, the classification of network hosts: do they belong to the PCI DSS scope or some other group of critical hosts, or not?
In this case, Confluence can be quite a convenient tool for maintaining such a registry. A page with a #table can be created very quickly, and multiple employees can immediately start working with it.
Let’s see how to convert such a #table, export it to #Splunk and use it together with other data.
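The conversion step can be sketched with nothing but the standard library. Confluence returns a page body as HTML (e.g. via the REST API with `expand=body.storage`), so a small `html.parser` subclass of my own is enough to flatten the table into rows:

```python
from html.parser import HTMLParser

class TableParser(HTMLParser):
    """Collect <table> contents as a list of rows (lists of cell strings)."""
    def __init__(self):
        super().__init__()
        self.rows, self._row, self._in_cell = [], None, False

    def handle_starttag(self, tag, attrs):
        if tag == "tr":
            self._row = []
        elif tag in ("td", "th"):
            self._in_cell = True
            self._row.append("")          # open a new cell

    def handle_endtag(self, tag):
        if tag == "tr" and self._row:
            self.rows.append(self._row)
        elif tag in ("td", "th"):
            self._in_cell = False

    def handle_data(self, data):
        if self._in_cell:
            self._row[-1] += data.strip()

parser = TableParser()
parser.feed("<table><tr><th>host</th><th>pci_scope</th></tr>"
            "<tr><td>srv1</td><td>yes</td></tr></table>")
```

Each row in `parser.rows` can then be emitted as one JSON event for Splunk, using the header row as field names.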
#table #Splunk #python #PCIDSS #html #AtlassianConfluence #Atlassian #SIEM #API
Read more: https://avleonov.com/2018/08/04/sending-tables-from-atlassian-confluence-to-splunk/
Asset Inventory for Network Perimeter: from Declarations to Active Scanning
In the previous post, I shared some of my thoughts about what a good #AssetInventory system should look like. Of course, for me as a security specialist, it would be great if IT provided such a magical system. 🙂 But such an ideal situation is rarely possible. So now let’s see how to build an #AssetInventory system using the resources of the Information Security team.
There are no special secrets here. It’s necessary to get information about the assets from all available IT systems and then get the rest of the data using our own assessment tools. I would like to start with the hosts on the Network Perimeter. The Network Perimeter targets are available for hacker attacks at any time; that’s why this part of the network is the most critical.
The Perimeter is changing constantly, and we should understand at any time which hosts are currently exposed in every office and on every external hosting platform.
We can get information about external hosts using a Vulnerability Scanner located on an external host on the Internet. I have already written about it briefly in “#VulnerabilityManagement for Network Perimeter”. Here I would like to focus on how we can understand which hosts should be scanned and what useful information we can get from the raw #scan results.
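The first step — turning declared DNS names into a scan target list — can be sketched like this (an illustrative helper of my own; the input list would come from IT declarations, #DNS zone files, Confluence pages, etc.):

```python
import socket

def resolve_targets(hostnames):
    """Deduplicate declared DNS names and resolve them to IPv4 addresses.
    Names that no longer resolve map to None - stale declarations are
    themselves a useful finding for the inventory."""
    targets = {}
    for name in sorted(set(hostnames)):
        try:
            targets[name] = socket.gethostbyname(name)
        except socket.gaierror:
            targets[name] = None
    return targets

# resolve_targets(["www.example.com", "mail.example.com", "www.example.com"])
```

The resolved addresses become the scanner’s target list, and comparing them against the declared ownership shows which office or hosting platform each exposed host belongs to.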
#Tenable #python #Nessus #MSWord #MSExcel #DNS #AtlassianConfluence #VulnerabilityManagement #PerimeterService #Concept
Read more: https://avleonov.com/2018/08/16/asset-inventory-for-network-perimeter-from-declarations-to-active-scanning/
Asset Inventory for Internal Network: problems with Active Scanning and advantages of Splunk
In the previous post, I was writing about Asset Inventory and Vulnerability Scanning on the Network Perimeter. Now it’s time to write about the Internal Network.
There is a common belief that we can use Active Network Scanning for #AssetInventory in the organization. Currently, I’m not a big fan of this approach, and I will try to explain here the disadvantages of this method and mention some alternatives.
#TrendMicro #Tenable #subnet #Splunk #Qualys #python #Nessus #McAfee #Kaspersky #firewall #FireEyeHX #FireEye #CiscoISE #Cisco #VulnerabilityManagement #SIEM #API
Read more: https://avleonov.com/2018/08/20/asset-inventory-for-internal-network-problems-with-active-scanning-and-advantages-of-splunk/
Retrieving IT Asset lists from NetBox via API
A little bit more about the IT #AssetInventory of the Internal Network that your IT team can provide. 😉
I have recently worked with #NetBox – an open source IP address management (IPAM) and data center infrastructure management (DCIM) solution developed by the well-known cloud hosting provider #DigitalOcean.
It’s not really about security; it’s not even a #CMDB. But a security team still might be interested in #NetBox, because it makes it possible to track the hosts in some critical #subnet without active scanning, providing great visibility of assets. Here I will show a small example of #NetBox #API usage.
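A sketch of pulling the registered IP addresses out of NetBox. The `/api/ipam/ip-addresses/` endpoint, the `Authorization: Token ...` header and the paginated `results`/`next` response shape follow NetBox’s REST API documentation; the instance URL and token below are placeholders:

```python
import json
import urllib.request

NETBOX_URL = "https://netbox.example.com"   # your NetBox instance
NETBOX_TOKEN = "YOUR_API_TOKEN"

def parse_page(page):
    """Extract the addresses and the next-page URL from one API response."""
    return [item["address"] for item in page["results"]], page["next"]

def fetch_ip_addresses():
    """Walk the paginated IPAM endpoint and return all registered addresses."""
    url = NETBOX_URL + "/api/ipam/ip-addresses/?limit=100"
    addresses = []
    while url:
        req = urllib.request.Request(
            url, headers={"Authorization": "Token " + NETBOX_TOKEN,
                          "Accept": "application/json"})
        page = json.load(urllib.request.urlopen(req))
        chunk, url = parse_page(page)   # "next" is None on the last page
        addresses += chunk
    return addresses
```

The resulting list can then be compared against scan results or sent to #Splunk as yet another asset source.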
#Splunk #python #NetBox #IPAM #DigitalOcean #DCIM #AssetInventory #API
Read more: https://avleonov.com/2018/09/05/retrieving-it-asset-lists-from-netbox-via-api/