
Searching a csv file for the largest value in a column


Br@d

Recommended Posts

I do not know if this is possible <Uber Noob Here>, but I'm trying to automate the capture of open WiFi traffic, to be used on a headless Raspberry Pi or possibly a WiFi Pineapple.

What I have so far is a basic Bash script (which I plan to run on startup) that sets wlan0 into monitor mode, then does a 30-second capture with airodump-ng and writes the results to a CSV file. From there I can use grep to find only the lines for BSSIDs with open authentication.

What I want to do next, and the part I currently have issues with, is to find the BSSID (row) with the highest value for IVs (traffic) and output the value in its channel column.

From there I plan to restart airodump-ng to capture traffic on that defined channel and write it to a pcap file.
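
In outline, what I have working plus the step I'm stuck on looks something like this (just a rough, untested sketch; the interface name, file names and timeout are placeholders, and the grep for "OPN" assumes that's how airodump-ng labels open networks in its Privacy column):

#!/bin/bash
# Rough outline - names and paths are placeholders.

# Put the card into monitor mode (creates wlan0mon on newer aircrack-ng builds).
airmon-ng start wlan0

# 30-second scan; airodump-ng writes the results to scan-01.csv.
timeout 30 airodump-ng --write scan --output-format csv wlan0mon

# Keep only the access-point lines with open authentication.
grep 'OPN' scan-01.csv > open_aps.csv

# TODO: find the row in open_aps.csv with the highest IV count,
#       pull out its channel, then restart airodump-ng on that
#       channel and write a pcap file.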

Any suggestions on how to accomplish this next step? Or am I going about this all wrong?


Not sure if this will help, but hopefully it gives some ideas on how to go about it from the command line.

http://unix.stackexchange.com/questions/170204/find-the-max-value-of-column-1-and-print-respective-record-from-column-2-from-fi
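
Along the same lines as that answer, an awk one-liner can track the maximum as it reads each line. This is only a sketch and assumes airodump-ng's CSV layout, with the channel in column 4 and the IV count in column 11 (open_aps.csv standing in for your grepped file), so adjust the field numbers if yours differs:

# Print the channel of the row with the highest IV count (field numbers assumed).
awk -F',' '$11+0 > max { max = $11+0; chan = $4 } END { print chan }' open_aps.csv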

 

Sorting first might help too, along with the order of your columns: if you put the IV count as the first field and do a descending sort, the row you want comes out first. See the example below.
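
For example, something like this should get the channel of the busiest row without reordering anything (untested, and again assuming the IV count is in column 11 and the channel in column 4):

sort -t',' -k11,11nr open_aps.csv | head -1 | cut -d',' -f4 | tr -d ' '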

There may even be a way to sort and search columns of a CSV directly, but I don't know of any specific programs/commands that do this. Another thought is putting it in a small DB file and using an SQL query to do the work for you (I am by no means a SQL person though). It might also make sense to keep the data in a DB file for later, adding info over time and displaying it on a web page with some PHP, etc. A rough sketch of that idea follows.
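
As a sketch of the DB route with sqlite3 (the table columns here are just guesses at the airodump-ng AP section layout, so rename/reorder them to match your actual CSV):

sqlite3 wifi.db <<'EOF'
CREATE TABLE IF NOT EXISTS aps (bssid TEXT, first_seen TEXT, last_seen TEXT,
  channel INTEGER, speed INTEGER, privacy TEXT, cipher TEXT, auth TEXT,
  power INTEGER, beacons INTEGER, ivs INTEGER, lan_ip TEXT, id_len INTEGER,
  essid TEXT, key TEXT);
.mode csv
.import open_aps.csv aps
SELECT channel FROM aps ORDER BY ivs DESC LIMIT 1;
EOF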


Commands like awk, sed and cut can be combined, and they all (as far as I know) also have regex capabilities to exclude certain characters or only allow specific ones like letters and numbers, etc. Regex is not my strong suit, but you can test with online sites like http://regexr.com/, which I often use to help ensure I'm grabbing what I want. Paste in a line of text and play with the regex until you get the strings you want.
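
For example, a pattern like this should pull just the access-point lines that start with a MAC address and show open authentication (rough and untested, with scan-01.csv being whatever airodump-ng wrote, so do check it against a real line on regexr first):

grep -Ei '^([0-9a-f]{2}:){5}[0-9a-f]{2},.*\bOPN\b' scan-01.csv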


If you're combining awk, sed and cut together and you're planning on using the code in the long term, then you'd probably want to consider moving over to a Perl or Python script for that part instead, as it will be easier to maintain.

I'm not sure on the exact layout of your CSV file, but something like the following might produce the result you're after (you'll probably have to tune the field numbers being used to extract the $bssid and $ivs variables).

#!/usr/bin/perl

use strict;
use warnings;

my $maxIVs;
my $maxBSSID;

# Read the (already grep-filtered) airodump-ng CSV on STDIN, one line at a time.
while (my $line = <STDIN>) {
    chomp $line;

    # Split on commas, trimming the surrounding whitespace airodump-ng adds.
    my @field = split /\s*,\s*/, $line;

    my $bssid = $field[0];   # column 1:  BSSID
    my $ivs   = $field[10];  # column 11: # IV

    # Skip the header row and anything without a numeric IV count.
    next unless defined $ivs && $ivs =~ /^\d+$/;

    # Remember whichever BSSID has the highest IV count seen so far.
    if (!defined $maxIVs || $ivs > $maxIVs) {
        $maxBSSID = $bssid;
        $maxIVs   = $ivs;
    }
}

print "$maxBSSID";

If you make the script executable then you can use it in a bash script to populate a variable (you would probably want to change the path to match wherever you put the script).

TARGET_BSSID=`./maxIV < test.csv`
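
And since it's the channel you need for the next airodump-ng run, you could then look that BSSID back up in the CSV to get it, something along these lines (untested; the channel field is assumed to be column 4, and the interface name will depend on your setup):

# Look up the channel for the chosen BSSID (column 4 assumed), then capture on it.
TARGET_CHANNEL=`grep -i "$TARGET_BSSID" test.csv | cut -d',' -f4 | tr -d ' '`
airodump-ng -c "$TARGET_CHANNEL" --bssid "$TARGET_BSSID" -w capture --output-format pcap wlan0mon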

 

