Thursday, May 28, 2020

Deployment-time security audit using a CloudFormation custom resource.


To prevent deployment of potentially sensitive resources or infrastructure into an AWS account that might not meet current organizational security standards, we can use an AWS CloudFormation custom resource to perform a quick security audit (a sanity check of sorts) of the cloud account before proceeding with the deployment.

Why do we need this if we can scan/audit the account as part of the CI/CD pipeline? For cases when deployments are performed manually, or to have a CI/CD-independent, "portable" CloudFormation template that has all security checks built in rather than bolted on.

How it works:

  1. To your normal CloudFormation template you add a custom resource.
  2. This custom resource is, technically speaking, a Lambda function that is created and invoked during CloudFormation stack deployment.
  3. This Lambda function performs a quick (to meet CloudFormation deployment timeout restrictions) security audit of the account where the template is going to be deployed.
  4. As a result of this audit, the Lambda returns a status that CloudFormation interprets as the resource creation outcome.
  5. If the AWS environment passes the security check, deployment of the other resources in your stack continues as usual.
  6. If the AWS environment fails the security check, stack deployment is interrupted and rolled back as a result of the custom resource failure.
I will publish an example of such functionality on my GitHub shortly and will update this post with more details.
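In the meantime, the Lambda behind the custom resource could look roughly like this minimal sketch (the audit checks and the account facts are illustrative placeholders, not the actual implementation; a real function would gather the facts with boto3 calls):

```python
import json
import urllib.request

def audit_account(facts):
    """Return a list of failed checks for the given account facts.
    These two rules are illustrative examples only."""
    failures = []
    if not facts.get("cloudtrail_enabled", False):
        failures.append("CloudTrail is not enabled")
    if facts.get("public_s3_buckets", 0) > 0:
        failures.append("Account has publicly readable S3 buckets")
    return failures

def send_cfn_response(event, status, reason=""):
    """Report SUCCESS/FAILED back to CloudFormation via the pre-signed URL."""
    body = json.dumps({
        "Status": status,
        "Reason": reason or "See CloudWatch logs",
        "PhysicalResourceId": "deployment-time-security-audit",
        "StackId": event["StackId"],
        "RequestId": event["RequestId"],
        "LogicalResourceId": event["LogicalResourceId"],
    }).encode()
    req = urllib.request.Request(event["ResponseURL"], data=body, method="PUT")
    urllib.request.urlopen(req)

def handler(event, context):
    # Deleting the custom resource must always succeed, or the stack
    # could get stuck on rollback.
    if event.get("RequestType") == "Delete":
        send_cfn_response(event, "SUCCESS")
        return
    # A real audit would gather these facts with quick boto3 calls,
    # staying well under the CloudFormation resource timeout.
    facts = {"cloudtrail_enabled": True, "public_s3_buckets": 0}
    failures = audit_account(facts)
    status = "SUCCESS" if not failures else "FAILED"
    send_cfn_response(event, status, "; ".join(failures))
```

A FAILED status here is exactly what makes CloudFormation interrupt and roll back the rest of the stack.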

Sunday, May 24, 2020

Nmap.me new version - now with vulnerability scan


Now the service:
  • Does a full port and vulnerability scan of the caller's IP
  • Immediately performs and returns TCP scan results, then starts a vulnerability scan in the background
  • Visit nmap.me again in about 30 minutes (depending on load), and the vulnerability scan results (port, service, vulnerability, CVE) for your IP will be displayed
  • You can use the console-friendly endpoint scan.nmap.me, or
  • use the human-readable website: nmap.me
Scan results examples:

Good scan results:

Not so good scan results:



As before, the service is built using AWS-native capabilities with a serverless and containerized approach, and is designed to be extremely scalable.



Quick FAQ:

What's new? Now it does a full vulnerability scan of the IP.

What does it do? Scans your external IP for open TCP ports and known vulnerabilities.
How to use? Simply run _curl scan.nmap.me_ from your console/terminal or open this website in a browser. You will get the TCP scan results immediately; visit the same page again in about an hour to get the vulnerability scan results.

What is it scanning for? It uses an nmap NSE script to scan for known vulnerabilities, based on https://github.com/vulnersCom/nmap-vulners

How fast? The whole TCP scan takes a few seconds and the results are shown immediately. After that, the vulnerability scan starts. Depending on the backend load, it might take about an hour. Then simply visit the same page again to get the vulnerability scan results. Scan results for each requester IP are cached for 1 hour (TCP scan) and 24 hours (vuln scan) to reduce load and prevent abuse.
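The two-phase flow above can also be scripted as a small client. A sketch (the exact output format of scan.nmap.me is an assumption here; I assume nmap-style "22/tcp open ssh" lines):

```python
import re
import urllib.request

def fetch_scan(url="http://scan.nmap.me"):
    """Fetch the caller's current scan results as plain text.
    The first call returns TCP results; call again later (about an hour,
    depending on load) to pick up the vulnerability results."""
    with urllib.request.urlopen(url, timeout=30) as resp:
        return resp.read().decode()

def parse_open_ports(text):
    """Pull (port, service) pairs out of nmap-style lines like '22/tcp open ssh'."""
    return re.findall(r"(\d+)/tcp\s+open\s+(\S+)", text)
```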

Friday, May 15, 2020

Using MFA with AWS CLI

It's quite obvious nowadays that you must use MFA wherever it's available.
Enabling MFA for your user account in AWS IAM will automatically enforce it for AWS web UI logins.

But what about the AWS CLI, your code using the AWS SDK, and third-party SDK-based tools?
In this case, to leverage MFA you need to enforce it using a "Condition" statement in the IAM policy assigned to your user, as described in the following AWS manual:
https://docs.aws.amazon.com/cli/latest/userguide/cli-configure-role.html

In a nutshell, something like this:

Enforce MFA for the assume role:
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "",
      "Effect": "Allow",
      "Principal": { "AWS": "arn:aws:iam::123456789012:user/anika" },
      "Action": "sts:AssumeRole",
      "Condition": { "Bool": { "aws:MultiFactorAuthPresent": "true" } }
    }
  ]
}

Add MFA to your AWS CLI profile:

[profile role-with-mfa]
region = us-west-2
role_arn = arn:aws:iam::128716708097:role/cli-role
source_profile = cli-user
mfa_serial = arn:aws:iam::128716708097:mfa/cli-user

Simple? Not exactly - here are some tricky things that are not covered by the AWS documentation (at least I was not able to find them).

1. The AWS documentation is a bit misleading: in the IAM statement and documentation the user name mentioned is Anika, but all the CLI configs point to a non-existent CLI profile, "cli-user".

2. The AWS CLI MFA configuration works ONLY when you are assuming a role. Yep, if you have one simple account and a few users and groups (as many small companies do), you can't leverage this functionality without a small trick (see item 3).

3. You can still leverage MFA with the CLI by using a role:

  • Strip all access from the user you use to log in, except "assume role"; or, alternatively, enforce MFA for all actions using the condition from above.
Note:
If you strip all permissions, you will need to assume the role even when using the web UI.
If you use the alternative approach and enforce MFA for all API actions, you can keep using the web UI without assuming a role, the same way as before.
  • Create a role (example: MyOrganizationAccountAccessRole) to assume in the same account, with MFA enforced and all required access rights. If you have more than one account, create this role in the other accounts as well, with the same MFA enforcement condition.
  • Create an extra profile my-account-mfa (in addition to the main account profile my_account) for accessing the same account (my-account) using this role:
[profile my-account-mfa]
role_arn = arn:aws:iam::123456789:role/MyOrganizationAccountAccessRole
source_profile = my_account
mfa_serial = arn:aws:iam::123456789:mfa/it-security@ca

[profile my_account]
output = json
region = us-east-1
mfa_serial = arn:aws:iam::123456789:mfa/it-security@ca

[profile my_second_account]
role_arn = arn:aws:iam::987654321:role/MyOrganizationAccountAccessRole
source_profile = my_account
mfa_serial = arn:aws:iam::123456789:mfa/it-security@ca

Note: all profiles reference the my_account profile as the source.


If needed, create an extra profile (my_second_account) for any other account you need to access using the role.

Use the profile my-account-mfa for your CLI access to the main account, or for any tools. You will see an MFA prompt, and after providing the MFA code everything will work like a charm.
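Under the hood, a profile with role_arn plus mfa_serial makes the CLI issue an sts:AssumeRole call that carries your MFA device serial and token. A small sketch of the parameters it derives (the helper and the profile dict are illustrative, not an actual CLI internal):

```python
def assume_role_params(profile, session_name="cli-mfa-session"):
    """Build sts:AssumeRole parameters from a CLI profile dict."""
    params = {
        "RoleArn": profile["role_arn"],
        "RoleSessionName": session_name,
    }
    if "mfa_serial" in profile:
        # The presence of mfa_serial is what triggers the MFA token prompt;
        # the token itself is passed as TokenCode at call time.
        params["SerialNumber"] = profile["mfa_serial"]
    return params

# The my-account-mfa profile from above, as a dict:
profile = {
    "role_arn": "arn:aws:iam::123456789:role/MyOrganizationAccountAccessRole",
    "source_profile": "my_account",
    "mfa_serial": "arn:aws:iam::123456789:mfa/it-security@ca",
}
# With boto3 this would become:
#   sts.assume_role(**assume_role_params(profile), TokenCode="123456")
```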

Enjoy and stay secure!

Wednesday, December 4, 2019

Nmap.me completely rebuilt

To improve performance and service scalability, nmap.me was completely rebuilt leveraging AWS-native services and a serverless approach.

Now the service:
  • supports both http and https
  • has a dedicated scanning endpoint: scan.nmap.me
  • has a human-readable website: nmap.me
  • has an API-driven scanning endpoint that will support REST API calls for advanced functionality
  • is serverless and scalable
Main functionality is so far unchanged:
What does it do? A TCP scan of your external IP.
What is it scanning for? The 100 most used TCP ports. Actually a bit more than 100 - I'm slowly adding more ports.
How to use? Simply curl scan.nmap.me from your console/terminal, open it in a browser, or visit nmap.me (JavaScript will trigger the scan).
How fast? The whole scan takes about a second. Results for each requester IP are cached for 1 hour to reduce load and prevent abuse.

Why? I needed a quick way to check open ports on a server/gateway/fw/router while inside the console.

Saturday, May 4, 2019

Awesome list of Native AWS logging capabilities

While looking at the centralized logging capabilities of AWS and going through a bunch of documentation, I noticed the lack of one "big table" where I could find all AWS-native logging capabilities for each service, plus up-to-date service coverage for the AWS CloudWatch Logs service.
Building a big table is not really version-control friendly, so please welcome:

Awesome list of Native AWS logging capabilities:
https://github.com/IhorKravchuk/it-security/blob/master/AWS_logging.md

While I was building this list, some services had already changed their capabilities, causing some information in the list to go out of sync.
I'll try my best to regularly review existing services and keep adding new ones, but if you find a mistake or would like to contribute, feel free to contact me or create a PR.

Friday, April 12, 2019

Using Terraform to create project and users required in GCP and GSuite

This article is more like quick HOWTO/QuickNote page to start using Terraform with GCP, grant required permission, connect Terraform to GSuite and create users and projects using Terraform.

Connect Terraform to GCP:

1. Download and install Google Cloud SDK: https://cloud.google.com/sdk/install

2. Initialize the SDK: gcloud init
This process will launch a browser-based authorization flow: https://cloud.google.com/sdk/docs/initializing

3. Use the browser to create a project and a service account and to download the credentials: https://cloud.google.com/sdk/docs/authorizing Note: You need to have a GCP billing account and a payment method configured first. You can use the CLI as well:

gcloud projects list
gcloud beta billing accounts list
gcloud beta billing projects link infosec-gcp --billing-account 01122-74525-1222
gcloud config list
gcloud iam service-accounts create infosec-terraform --display-name "Infosec Terraform admin account"
gcloud iam service-accounts keys create ~/.config/gcloud/infosec-terraform-admin.json --iam-account infosec-terraform@infosec-gcp.iam.gserviceaccount.com

4. Give the appropriate permissions to Terraform:
Get your organization id:
gcloud organizations list

Enable the IAM API (yes, you need to enable each API set you are planning to use with GCP; they are disabled by default). You can list the available services using gcloud services list --available
gcloud services enable iam.googleapis.com

Check the existing IAM policies in your org:
gcloud organizations get-iam-policy ORGANIZATION_ID

Grant all the required permissions (example):
gcloud organizations add-iam-policy-binding ORGANIZATION_ID --member serviceAccount:infosec-terraform@infosec-gcp.iam.gserviceaccount.com --role roles/resourcemanager.projectCreator

gcloud organizations add-iam-policy-binding ORGANIZATION_ID --member serviceAccount:infosec-terraform@infosec-gcp.iam.gserviceaccount.com --role roles/billing.user

gcloud organizations add-iam-policy-binding ORGANIZATION_ID --member serviceAccount:infosec-terraform@infosec-gcp.iam.gserviceaccount.com --role roles/owner

5. Start using Terraform from my example to create a project and grant access to it.

The only missing part is actually the users.
Connecting Terraform to GSuite:

Why do we need GSuite at all? GCP does not provide any built-in identities and relies on user identities from Gmail, GSuite, or Google Cloud Identity (plus service accounts).

As an AWS user, I really love having user/group management and infra/project creation handled by the same automation tool. Unfortunately, user/GSuite functionality is not provided by the GCP Terraform provider. Luckily, there is a pretty nice open-source Terraform provider for GSuite written by DeviaVir: https://github.com/DeviaVir/terraform-provider-gsuite
At the moment I tested it, some group membership functionality still lacked idempotency, but using the approach from my example everything started to work like a charm.

So, finally, the code:
https://github.com/IhorKravchuk/it-security/tree/master/GCP


PS.
Way more details and examples are in the articles below:
https://medium.com/@josephbleroy/using-terraform-with-google-cloud-platform-part-1-n-6b5e4074c059
https://cloud.google.com/community/tutorials/managing-gcp-projects-with-terraform
https://cloud.google.com/community/tutorials/getting-started-on-gcp-with-terraform

Tuesday, February 26, 2019

Revamping AWS APIs' security review and SCP policy generation process.

AWS Cloud provides an endless number of capabilities and services. Unleashing all this power without a proper security review process is extremely risky.
Each service, and quite often each individual API call, should be reviewed and evaluated against organizational security standards and compliance requirements. Yes, but... currently AWS has about 170 services and an endless number of APIs. AWS constantly evolves, introducing new services and APIs and modifying existing ones.
One of the biggest challenges for me was finding a way to automatically fetch an up-to-date annotated list of the services and APIs provided by AWS. Luckily, Matt Weagle suggested using the AWS Go SDK as a source of truth. This SDK provides well-documented lists of the AWS APIs (docs-2.json).

I crafted a small Python program that builds/updates the following yaml files (one per service) using the json files as a source:

guardduty:
  description: Assess, monitor, manage, and remediate security issues across your
    AWS infrastructure, applications, and data.
  links: [http://guardduty.docs.here]
  security_risk: Cloud IDS
  Allowed_on:
  - Prod_en
  Denied_on:
  - none
AcceptInvitation:
  description: Accepts the invitation to be monitored by a master GuardDuty account.
  links: [https://awsdocs.com]
  security_risk: should be allowed only from trusted accounts
  Allowed_on:
  - none
  Denied_on:
  - none
ArchiveFindings:
  description: Archives Amazon GuardDuty findings specified by the list of finding
    IDs.
  links: []
  security_risk: Not defined
  Allowed_on:
  - none
  Denied_on:
  - none

The structure of this file is quite self-explanatory and simplifies the (still manual) security review of the AWS APIs. During the security review, you specify which services/APIs are enabled/disabled and on which environments by adding the environment name to the Allowed_on and Denied_on lists. The files are stored in a git repo.
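The skeleton-building step could be sketched roughly like this (the "operations" field name and the HTML-tagged doc strings are assumptions based on the AWS Go SDK's docs-2.json layout, not the actual tool):

```python
import re

def strip_tags(html):
    """docs-2.json documentation strings carry <p>...</p> markup; drop it."""
    return re.sub(r"<[^>]+>", "", html or "")

def yaml_skeleton(service_name, docs):
    """Build a per-API review skeleton in the shape of the yaml file above,
    with Allowed_on/Denied_on defaulting to none until reviewed."""
    entry = {}
    for op, doc in docs.get("operations", {}).items():
        entry[op] = {
            "description": strip_tags(doc),
            "links": [],
            "security_risk": "Not defined",
            "Allowed_on": ["none"],
            "Denied_on": ["none"],
        }
    return {service_name: entry}
```

An update run would merge this skeleton into the existing yaml file, keeping earlier review verdicts and surfacing newly introduced APIs.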

After the review, using these files as a source of truth, I (actually another Python program) generate an SCP (Service Control Policy) for AWS Organizations accounts, IAM policies, and permission boundaries (depending on the case).
Due to the very strict SCP size restrictions, generating this policy using automation allows you to:

  • aggregate APIs using wildcards to reduce the SCP size
  • validate API wildcards, preventing unintentional service exposure/blockage
  • perform cross-checks of the APIs to avoid whitelisting/blacklisting conflicts
  • re-generate/validate the SCP when AWS introduces new API calls/services
Everything mentioned above is valid not only for the SCP, but also for the IAM policy/permission boundary generation process.
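A sketch of the generation step under those constraints (the review data and the wildcard check are illustrative; the real input would be parsed from the yaml files):

```python
import fnmatch
import json

# Illustrative review verdicts in the shape of the yaml files above.
REVIEW = {
    "guardduty": {
        "AcceptInvitation": {"Allowed_on": ["none"], "Denied_on": ["Prod_en"]},
        "ArchiveFindings": {"Allowed_on": ["none"], "Denied_on": ["Prod_en"]},
        "CreateDetector": {"Allowed_on": ["Prod_en"], "Denied_on": ["none"]},
    },
}

def denied_actions(review, env):
    """Collect service:Api actions denied on the given environment."""
    return sorted(
        f"{svc}:{api}"
        for svc, apis in review.items()
        for api, verdict in apis.items()
        if env in verdict.get("Denied_on", [])
    )

def safe_wildcard(pattern, allowed):
    """A wildcard may replace explicit deny entries only if it does not
    also match an action the review explicitly allowed."""
    return not any(fnmatch.fnmatch(action, pattern) for action in allowed)

def build_scp(actions):
    """Emit a minimal deny-list SCP document for the collected actions."""
    return json.dumps({
        "Version": "2012-10-17",
        "Statement": [{"Effect": "Deny", "Action": actions, "Resource": "*"}],
    }, indent=2)
```

Here, for instance, "guardduty:A*" safely covers both denied calls and shortens the policy, while "guardduty:*" would also block the allowed CreateDetector - exactly the kind of conflict the cross-check catches.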

This automated approach opens up another possibility - automated compliance validation for AWS: using the same yaml files as a source of truth, perform API calls against AWS to ensure that the denied calls fail. This step could be done after deployment (to validate the deployment) or on a regular basis (as an audit).

PS. Unfortunately, the code for these tools can't be open-sourced as of now.