OK, my primary goal is to find any RFC1918 address space in my public-facing DNS view. So how do I do that?
First, go to Data Management – DNS – Public DNS View
Click Export to download a CSV containing all the zones in your Public DNS view
Open that CSV file and copy all the zones (domains)
On your Linux box, create a text file, paste in all those zones (one zone per line), and save the file as zonelist.txt
Now create another file and paste in the script below (edit the variables at the top to match your environment)
#!/bin/bash
# This script reads a file, zonelist.txt, that contains a
# list of zones to download. A separate CSV file will be
# created for each zone, named the same as the zone.
# Minimal error checking is performed. Use at your own risk.
# All files are located in the same directory as this script.

# Username and password with permission to download CSV files
USERNAME="admin"
PASSWORD="Telec0mm"

# Grid Master
SERVER="gridprod.hosangit.corp"

# Define file containing list of zones to export
ZONELIST="zonelist.txt"

# Define file that will contain results of curl command
OUTFILE="result.txt"

# Location of curl on this system. Use -s so curl is silent
CURL="/usr/bin/curl -s"

# WAPI version
VERSION="v2.3"

# What view are these zones in? "default" maybe
VIEW="External"
#VIEW="default"

############################################
# No more variables to set below this line #
############################################

# Process the zonelist file one line at a time
while read ZONE
do
    echo
    echo "Processing zone: $ZONE"

    # Ask NIOS to create a CSV export for this zone
    $CURL \
        --tlsv1 \
        --insecure \
        --noproxy '*' \
        -u "$USERNAME:$PASSWORD" \
        -H "Content-Type: application/json" \
        -X POST "https://$SERVER/wapi/$VERSION/fileop?_function=csv_export" \
        -d "{\"_object\":\"allrecords\",\"view\":\"$VIEW\",\"zone\":\"$ZONE\"}" \
        > "$OUTFILE"

    ERROR_COUNT=$(grep -c Error "$OUTFILE")
    if [ "$ERROR_COUNT" -gt 0 ]; then
        # Display the error and skip the rest of this loop iteration
        grep Error "$OUTFILE"
        continue
    fi

    # Get the "token" and download "url" for later use
    TOKEN=$(grep token "$OUTFILE" | cut -d'"' -f4)
    URL=$(grep url "$OUTFILE" | cut -d'"' -f4)
    echo "Token: $TOKEN"
    echo "URL: $URL"

    # Download the CSV file
    echo "Download CSV file section"
    $CURL \
        --tlsv1 \
        --insecure \
        --noproxy '*' \
        -u "$USERNAME:$PASSWORD" \
        -H "Content-Type: application/force-download" \
        -O "$URL"

    # Rename the CSV file so the file name matches the zone name.
    # Reverse zones will contain the / character, which will be interpreted
    # as a directory delimiter if included in the file name. Replace with +
    echo "rename CSV file section"
    FILENAME=$(echo "$ZONE.csv" | tr '/' '+')
    echo "Filename: $FILENAME"
    # curl -O saved the download under the last component of the URL
    mv "$(basename "$URL")" "$FILENAME"

    # Let NIOS know the download is complete
    echo "Let NIOS know download is complete SECTION"
    $CURL \
        --tlsv1 \
        --insecure \
        --noproxy '*' \
        -u "$USERNAME:$PASSWORD" \
        -H "Content-Type: application/json" \
        -X POST "https://$SERVER/wapi/$VERSION/fileop?_function=downloadcomplete" \
        -d "{\"token\": \"$TOKEN\"}"
done < "$ZONELIST"
exit 0
Save the file as extract.sh
Make the file executable by running: chmod 755 extract.sh
Now run the file: ./extract.sh
Once it completes, you will have a domain.com.csv file for each zone that was in your zonelist.txt
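Before moving on, it can be worth confirming that every zone actually produced a CSV (a zone that hit the Error branch gets skipped). This is a quick sketch, assuming zonelist.txt is in the current directory and using the same / to + substitution as the export script:

```shell
# Sanity check (sketch): report any zone from zonelist.txt
# that did not produce a matching CSV file.
while read ZONE; do
    FILE="$(echo "$ZONE.csv" | tr '/' '+')"   # same / -> + rename as extract.sh
    [ -f "$FILE" ] || echo "Missing: $FILE"
done < zonelist.txt
```

Any zone it prints as Missing can be re-run by putting just that zone in a fresh zonelist.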
Now, in the same directory as all those CSV files, you can find the RFC1918 addresses by running:
grep -E ',10\.|,172\.|,192\.168\.' * | sort > /tmp/all-rfc1918-records.txt
This grep searches for multiple patterns. Since these are comma-separated files, we can search for ,10. and ,172. and ,192.168. Note that ,172. over-matches: RFC1918 only reserves 172.16.0.0/12 (172.16 through 172.31), so expect some false positives from other 172.x addresses.
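If you want an exact match instead of the loose grep, the filtering can be sketched in Python using the stdlib ipaddress module, which checks each candidate IP against the three actual RFC1918 ranges (the function names here are my own, not from any tool above):

```python
import ipaddress
import re

# The three address blocks reserved by RFC1918
RFC1918 = [
    ipaddress.ip_network("10.0.0.0/8"),
    ipaddress.ip_network("172.16.0.0/12"),
    ipaddress.ip_network("192.168.0.0/16"),
]

def is_rfc1918(ip: str) -> bool:
    """True only if ip parses as an address inside an RFC1918 block."""
    try:
        addr = ipaddress.ip_address(ip)
    except ValueError:
        return False
    return any(addr in net for net in RFC1918)

def rfc1918_lines(lines):
    """Yield only the CSV lines containing at least one RFC1918 address."""
    ip_re = re.compile(r"\b(?:\d{1,3}\.){3}\d{1,3}\b")
    for line in lines:
        if any(is_rfc1918(ip) for ip in ip_re.findall(line)):
            yield line
```

Run over the exported CSVs (e.g. feed it every line of every *.csv file), this catches 172.16-172.31 addresses while correctly ignoring things like 172.200.1.1 that the grep pattern would flag.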