Expert Analysis API

Base URL

All URLs referenced in this documentation have the following base:

http://octapi.tech

The REST API is currently served over HTTP only; HTTPS is not supported.


Authentication

The API currently requires no authentication.


Clean up the CSV file

POST

http://octapi.tech/reCSV

Returns a cleaned-up version of the CSV file.

Parameters

csv (String): The CSV to clean.

Returns

version (String): A version number for the API.
originalHeaders (Array): A list of original headers extracted from the CSV file.
removedHeaders (Array): A list of the headers this API has removed from the CSV.
reCSV (Text): A version of the CSV with extraneous columns removed.

Python

Example usage in Python

    from pathlib import Path
    import requests
    import json
    #
    csv_file_path = "./myCsvFile.csv"
    csv = Path(csv_file_path).read_text()
    #
    api_endpoint = "http://octapi.tech/reCSV"
    data = {'csv': csv}
    r = requests.post(url=api_endpoint, data=data)
    y = json.loads(r.text)  # parse the JSON response; avoid eval() on untrusted text
    version = y["version"]
    originalHeaders = y["originalHeaders"]
    removedHeaders = y["removedHeaders"]
    reCSV = y["reCSV"]
  

Convert JSON ZIP ARCHIVE to CSV

POST

http://octapi.tech/zip2csv

Converts a ZIP archive containing JSON data into CSV format.

Parameters

zip (Base64-encoded data): The base64-encoded ZIP file to convert.

Returns

status (String): "OK" or "NG".
csv (String): Data in CSV format.
message (String): Error message; present only in the case of an error.

Python

Example usage in Python

    import requests
    import json
    import base64
    #
    fileName = "./myZipFile.zip"

    with open(fileName, "rb") as f:
        contents = f.read()  # avoid shadowing the built-in name "bytes"

    encoded = base64.b64encode(contents)
    api_endpoint = "http://octapi.tech/zip2csv"
    data = {'zip': encoded}
    theResponse = requests.post(url=api_endpoint, data=data)
    theJson = json.loads(theResponse.text)

    theCsv = theJson["csv"]

    # If you want to write out the CSV:
    with open('csvFile.csv', 'w') as f:
        f.write(theCsv)
  

Convert CSV to JSON

POST

http://octapi.tech/objectify

Converts a CSV file into JSON format.

Parameters

csv (String): The CSV to convert.

Returns

version (String): A version number for the API.
headers (Array): A list of headers extracted from the CSV file.
dataSet (Object): The data set from the CSV file, structured as a JSON object.
topLinest (Array): The top lines in the CSV file, before the data, as a list.
summaryData (Array of objects): The summary data from the bottom of the CSV file.

An example of the returned object. The dataSet is itself an array of objects, with one element for each row in the CSV file. Each element is an object whose keys are the column headings and whose values are the items in that column for that row. The "headers" attribute lists all of the column headings. (Note: Firefox is quite useful for viewing JSON files. Chrome can be as well, but you may need to install JSONView.)
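Based on that description, a hypothetical response for a two-row CSV might look like the following; the headers and values below are invented for illustration, not actual API output:

```python
import json

# Hypothetical /objectify response for a two-row CSV (values invented for illustration).
response_text = '''
{
  "version": "1.0",
  "headers": ["Attenuation", "Throughput"],
  "dataSet": [
    {"Attenuation": "10", "Throughput": "850"},
    {"Attenuation": "20", "Throughput": "640"}
  ]
}
'''
y = json.loads(response_text)

# dataSet holds one object per CSV row; the keys are the column headings.
for row in y["dataSet"]:
    print(row["Attenuation"], row["Throughput"])
```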

Python

Example usage in Python

    from pathlib import Path
    import requests
    import json
    #
    csv_file_path = "./myCsvFile.csv"
    csv = Path(csv_file_path).read_text()
    #
    api_endpoint = "http://octapi.tech/objectify"
    data = {'csv': csv}
    r = requests.post(url=api_endpoint, data=data)
    y = json.loads(r.text)  # parse the JSON response; avoid eval() on untrusted text
    version = y["version"]
    headers = y["headers"]
    dataSet = y["dataSet"]
  

Convert CSV to CSV

POST

http://octapi.tech/csv2csv

Normalizes a CSV file into a common CSV format.

Parameters

csv (String): The CSV to convert.

Returns

version (String): A version number for the API.
csv (Text): A cleaned-up version of the CSV file provided as input.

The purpose of this routine is to provide a common format for any number of provided CSV files. The input is run through OBJECTIFY, and then a new CSV is created from that object. Since all data goes through the same filter (objectify), it should be relatively consistent when it comes back out.
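For example, several differently-formatted CSV files can be pushed through the same endpoint to obtain a consistent set. A sketch of that pattern; the helper function and file paths are illustrative, not part of the API:

```python
import requests
import json

def normalize_csvs(paths, api_endpoint="http://octapi.tech/csv2csv"):
    """Push each CSV file through /csv2csv so they all share a common format.

    Illustrative helper: assumes the 'csv' parameter and the returned
    'csv' attribute behave as documented above.
    """
    normalized = {}
    for path in paths:
        with open(path) as f:
            csv_text = f.read()
        r = requests.post(url=api_endpoint, data={'csv': csv_text})
        # The cleaned-up CSV text comes back in the "csv" attribute.
        normalized[path] = json.loads(r.text)["csv"]
    return normalized
```

Calling `normalize_csvs(["./runA.csv", "./runB.csv"])` would return a dict mapping each (hypothetical) file name to its cleaned-up CSV text.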

Python

Example usage in Python

    from pathlib import Path
    import requests
    import json
    #
    csv_file_path = "./myCsvFile.csv"
    csv = Path(csv_file_path).read_text()
    #
    api_endpoint = "http://octapi.tech/csv2csv"
    data = {'csv': csv}
    r = requests.post(url=api_endpoint, data=data)
    y = json.loads(r.text)  # parse the JSON response; avoid eval() on untrusted text
    version = y["version"]
    newCSV = y["csv"]
  

Analyze Time Series Data

POST

http://octapi.tech/crunch

Analyzes time series data and turns it into statistics: it creates statistical values (e.g. mean, max, min) for the various measured parameters (e.g. throughput, streams, MCS) for each value of attenuation.

Parameters

headers (Array): The set of headers (see objectify).
dataSet (Object or String): The data set (see objectify).
run (Number): The run number to analyze.

Returns

version (String): A version number for the API.
crunchedData (Object): The statistical data for the data set, structured as a JSON object.
Notes (Object): Alerts, Warnings, Commentary, and Summary based on analysis of the data.

An example of a simple RvR set of data with a single Pal Endpoint.
An example of a straight throughput test with eight (8) Pal Endpoints.

crunchedData is a fairly complicated object; viewing the examples above should help to make sense of its structure.


(Note: Firefox is quite useful for viewing JSON files. Chrome can be as well, but you may need to install JSONView.)

Python

Example usage in Python

    # headers and dataSet come from a previous call to /objectify (see above).
    api_endpoint = "http://octapi.tech/crunch"
    run = 1
    data = {'headers': headers, 'dataSet': json.dumps(dataSet), 'run': run}
    r = requests.post(url=api_endpoint, data=data)
    y = json.loads(r.text)
    version = y["version"]
    analyzedData = y["crunchedData"]
    print("The number of sets of analyzed data is {}".format(len(analyzedData)))
    for a in range(len(analyzedData)):
        print("The number of attenuation steps in set {} is {}".format(a + 1, len(analyzedData[a])))
  

Analyze Statistical Data

POST

http://octapi.tech/analyze

Analyze the statistical data. For example, provide feedback on the "quality" of a rate vs. range curve. In addition, compare the results of some statistical tests to pass/fail criteria, if those criteria are supplied.

Parameters

headers (Array): The set of headers (see objectify).
dataSet (Object or String): The data set (see objectify).
run (Number): The run number to analyze.
crunchedData (Object or String): The statistical data (see crunch).
pe (Number): The Pal Endpoint to analyze.
LimitsCSV (String): The CSV containing the limits information, if limits checking is desired. (Example File)
aName (String): The name of the analysis being called.

Returns

version (String): A version number for the API.
analyzedData (Object): The statistical data for the data set, structured as a JSON object (same as input).
Notes (Object): Notes based on analysis of the data. Includes Alerts, Warnings, and Notes.
limitTestResults (Array): Results of limit test analysis, if a limits file was provided.

Python

Example usage in Python

    # headers, dataSet, run, and analyzedData come from the earlier
    # /objectify and /crunch calls (see above).
    limits_file_path = "./Limits.csv"
    limits = Path(limits_file_path).read_text()
    #
    api_endpoint = "http://octapi.tech/analyze"
    pe = 1
    data = {'headers': headers, 'dataSet': json.dumps(dataSet), 'run': run,
            'crunchedData': json.dumps(analyzedData),
            'pe': pe, 'LimitsCSV': limits, 'aName': 'analyze_ThroughputVsAttenuation'}
    r = requests.post(url=api_endpoint, data=data)
    y = json.loads(r.text)
    print("Version : {}".format(y["version"]))
    print("Analyzed data length : {}".format(len(y["analyzedData"])))
    print("Alerts length : {}".format(len(y["Notes"]["Alerts"])))
    print("Warnings length : {}".format(len(y["Notes"]["Warnings"])))
    print("Limits test length : {}".format(len(y["limitTestResults"])))
    limitTest = y["limitTestResults"]
    print("Limit test results: {}".format(json.dumps(limitTest, indent=4)))
  

Analyze Statistical Data (Alternate Usage)

POST

http://octapi.tech/analyze

Analyze the statistical data. For example, provide feedback on the "quality" of a rate vs. range curve. In addition, compare the results of some statistical tests to pass/fail criteria, if those criteria are supplied.
This endpoint can also be supplied with the original CSV file instead of the intermediate statistical objects.

Parameters

DataCSV (String): The CSV data to use.
LimitsCSV (String): The CSV containing the limits information, if limits checking is desired.
aName (String): The name of the analysis being called.

Returns

version (String): A version number for the API.
analyzedData (Object): The statistical data for the data set, structured as a JSON object (same as input).
Notes (Object): Notes based on analysis of the data. Includes Alerts, Warnings, and Notes.
limitTestResults (Array): Results of limit test analysis, if a limits file was provided.

Python

Example usage in Python

    # csv and limits hold the contents of the data and limits CSV files,
    # read as in the previous examples.
    api_endpoint = "http://octapi.tech/analyze"
    data = {'DataCSV': csv, 'LimitsCSV': limits, 'aName': 'analyze_ThroughputVsAttenuation'}
    r = requests.post(url=api_endpoint, data=data)
    y = json.loads(r.text)
    print("Version : {}".format(y["version"]))
    print("Analyzed data length : {}".format(len(y["analyzedData"])))
    print("Alerts length : {}".format(len(y["Notes"]["Alerts"])))
    print("Warnings length : {}".format(len(y["Notes"]["Warnings"])))
    print("Limits test length : {}".format(len(y["limitTestResults"])))
    limitTest = y["limitTestResults"]
    print("Limit test results: {}".format(json.dumps(limitTest, indent=4)))
  

Get theoretical MCS

GET

http://octapi.tech/getMCS

Get MCS based on the 802.11 standard.

Parameters

standard (String): 802.11 standard: 11n, 11ac, 11ax.
RSSI (String): Received signal strength (dBm).
BW (String): Bandwidth (MHz): 20, 40, 80, 160.
streams (String): Number of streams: 1-8.

Returns

status (String): Status of the return.
MCS (Number): MCS value.

Python

Example usage in Python

    import requests

    api_endpoint = "http://octapi.tech/getMCS"
    standard = "11ax"
    RSSI = "-50"
    BW = "80"
    streams = "2"
    api_endpoint += "?" + "standard=" + standard
    api_endpoint += "&" + "RSSI=" + RSSI
    api_endpoint += "&" + "BW=" + BW
    api_endpoint += "&" + "streams=" + streams
    r = requests.get(url=api_endpoint)
    y = r.json()  # parse the JSON response; avoid eval() on untrusted text
    status = y["status"]
    MCS = y["MCS"]
    print("Returned with status {}".format(status))
    print("MCS found is {}".format(MCS))
  

Get theoretical data rate

GET

http://octapi.tech/getDRate

Get data rate based on the 802.11 standard.

Parameters

standard (String): 802.11 standard: 11n, 11ac, 11ax.
MCS (String): MCS (see /getMCS).
BW (String): Bandwidth (MHz): 20, 40, 80, 160.
streams (String): Number of streams: 1-8.
GI (String): Guard interval (ns). Depends on the standard, but usually 400, 800, 1600, or 3200.
tones (String): Number of tones for OFDMA. Optional, and only valid for 11ax. Should be an allowed number of tones.

Returns

status (String): Status of the return.
rate (Number): Data rate.

Python

Example usage in Python

    import requests

    api_endpoint = "http://octapi.tech/getDRate"
    standard = "11ax"
    MCS = "7"
    BW = "80"
    streams = "2"
    GI = "800"
    api_endpoint += "?" + "standard=" + standard
    api_endpoint += "&" + "MCS=" + MCS
    api_endpoint += "&" + "BW=" + BW
    api_endpoint += "&" + "streams=" + streams
    api_endpoint += "&" + "GI=" + GI
    print("The api endpoint is " + api_endpoint)
    r = requests.get(url=api_endpoint)
    y = r.json()  # parse the JSON response; avoid eval() on untrusted text
    status = y["status"]
    rate = y["rate"]
    print("Returned with status {}".format(status))
    print("Rate found is {}".format(rate))

PCAP APIs

Although these can be run against the hosted version of the API, uploading PCAP files takes time, and the hosted server may not be as fast as local servers. Therefore, the descriptions of the PCAP APIs will assume a local version of Expert Analysis, running on port 8080.

The basic PCAP parser

GET

http://localhost:8080/parsePCAP

Parse a PCAP file. The parsing (dissection) engine can either be Expert Analysis "native", or it can use TSHARK. Over time we expect TSHARK to be the preferred method.

Parameters

fileName (String): Name of the PCAP file. Note that for EA native dissection this must be a .pcap file. For TSHARK, any format (pcap, pcapng, gz) can be used.
pcapDir (String): Directory location for the file. If the file is in the default (pcapDirectory), this can be ignored. Otherwise, use a full path.
preferTshark (String, "true" or "false"): Indicates that tshark should be used for packet dissection, if possible. The current default is "false", but this is expected to change.
filters (String): Packet Types on which to filter. Exactly the same as on the UI.
  • Association request
  • Association response
  • Reassociation request
  • Reassociation response
  • Probe request
  • Probe response
  • Beacon
  • Announcement traffic indication map (ATIM)
  • Disassociate
  • Authentication
  • Deauthentication
  • Action frames
  • Block ACK Request
  • Block ACK
  • Power-Save Poll
  • Request to Send
  • Clear to Send
  • ACK
  • Contention Free Period End
  • Contention Free Period End ACK
  • Data
  • Data + Contention Free ACK
  • Data + Contention Free Poll
  • Data + Contention Free ACK + Contention Free Poll
  • NULL Data
  • NULL Data + Contention Free ACK
  • NULL Data + Contention Free Poll
  • NULL Data + Contention Free ACK + Contention Free Poll
  • QoS Data
  • QoS Data + Contention Free ACK
  • QoS Data + Contention Free Poll
  • QoS Data + Contention Free ACK + Contention Free Poll
  • NULL QoS Data
  • NULL QoS Data + Contention Free Poll
  • NULL QoS Data + Contention Free ACK + Contention Free Poll
  • EAPOL
  • Malformed Packet
  • FromDS
  • ToDS
  • TCP
  • IGMP
  • Trigger
  • VHT/HE NDP Announcement
categories (String): Packet Categories on which to filter. Exactly the same as on the UI.
  • Spectrum Management
  • QoS
  • DLS
  • Block ACK
  • Public
  • Radio Measurement
  • Fast BSS Transition
  • HT
  • SA Query
  • Protected Dual of Public Action
  • WNM
  • Unprotected WNM
  • TDLS
  • Mesh
  • Multihop
  • Self-protected
  • DMG
  • Wi-Fi Alliance
  • Fast Session Transfer
  • Robust AV Streaming
  • Unprotected DMG
  • VHT
  • TWT
  • HE
  • Protected HE
actions (String): Packet Action Categories on which to filter. Exactly the same as on the UI.
  • For category "WNM"
    • Event Request
    • Event Report
    • Diagnostic Request
    • Diagnostic Report
    • Location Configuration Request
    • Location Configuration Response
    • BSS Transition Management Query
    • BSS Transition Management Request
    • BSS Transition Management Response
    • FMS Request
    • FMS Response
    • Collocated Interference Request
    • Collocated Interference Report
    • TFS Request
    • TFS Response
    • TFS Notify
    • WNM Sleep Mode Request
    • WNM Sleep Mode Response
    • TIM Broadcast Request
    • TIM Broadcast Response
    • QoS Traffic Capability Update
    • Channel Usage Request
    • Channel Usage Response
    • DMS Request
    • DMS Response
    • Timing Measurement Request
    • WNM Notification Request
    • WNM Notification Response
    • WNM-Notify Response
  • For category "Fast Session Transfer"
    • FST Setup Request
    • FST Setup Response
    • FST Teardown
    • FST Ack Request
    • FST Ack Response
    • On-channel Tunnel Request
  • For category "TWT"
    • TWT Setup
    • TWT Teardown
ToMacs (String): A comma-separated list of MAC addresses that should be in the Transmitter, or Source, address of the packet.
FromMacs (String): A comma-separated list of MAC addresses that should be in the Receiver, or Destination, address of the packet.
Logic (String): "AND" or "OR". Used with the To/From addresses: packets must be To "AND" From the addresses, or To "OR" From the addresses.
startAt (String): The first packet number to use in the analysis.
endAt (String): The last packet number to use in the analysis.

Returns

uploadFiles (Array of strings): The list of files in the default PCAP directory.
errored (Boolean): Indicates an error condition on return.
errorMessage (String): Message in the error case.
maxPacketProcessed (Integer): The number of the last packet analyzed.
packetsProcessed (Integer): The total number of packets analyzed.
more (Boolean): Indicates whether there is more analysis to be done. This API handles a maximum of 20K packets per call; if more are requested, the analysis covers the first 20K and sets more to True.
timeMsec (Object, {total, perPacket}): Total analysis time, and time per packet.
fields (Array of objects): The list of returned packets. The object will be different for each packet type. Click here to see an example for a Trigger.
firstTimestamp (Integer): The first timestamp in the analyzed data.
firstFrameTimestamp (Integer): The first frame timestamp in the analyzed data.
firstArrivalTime (Float): The arrival time of the first frame in the analyzed data.
channelSwitches (Array): Channel switch announcement information from beacons.
bssTransitions (Empty Array): DEPRECATED.
roamThisMAC (String): MAC address for the roaming station in a roaming analysis.

Python

Example usage in Python

    # pcapFile, pcapDir, and REST_TIMEOUT are assumed to be defined already.
    api_endpoint = "http://localhost:8080/" + "parsePCAP?fileName=" + pcapFile + "&pcapDir=" + pcapDir
    apiResponse = requests.get(api_endpoint, timeout=REST_TIMEOUT)
    returnObject = json.loads(apiResponse.text)

    totalPackets = returnObject["packetsProcessed"]
    thePackets = returnObject["fields"]
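Since parsePCAP handles at most 20K packets per call, larger captures must be fetched in several calls. A sketch of that loop, assuming the startAt parameter and the errored, fields, more, and maxPacketProcessed return attributes behave as described above:

```python
import requests
import json

def parse_all_packets(pcap_file, pcap_dir,
                      base_url="http://localhost:8080/parsePCAP",
                      timeout=300):
    """Collect dissected packets across as many parsePCAP calls as needed.

    Sketch only: relies on the documented startAt parameter and the
    errored, fields, more, and maxPacketProcessed return attributes.
    """
    all_fields = []
    start_at = 1
    while True:
        params = {"fileName": pcap_file, "pcapDir": pcap_dir, "startAt": str(start_at)}
        r = requests.get(base_url, params=params, timeout=timeout)
        result = json.loads(r.text)
        if result.get("errored"):
            raise RuntimeError(result.get("errorMessage", "parsePCAP failed"))
        all_fields.extend(result["fields"])
        if not result.get("more"):
            break
        # Resume just past the last packet analyzed by this call.
        start_at = result["maxPacketProcessed"] + 1
    return all_fields
```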

The PCAP API toolkit

GET

http://localhost:8080/pcapAPI

These API calls make use of the PCAP parser, above, but they go a step further. There are some built-in analyses of the PCAP files that can be useful in further analysis.

Parameters

fileName (String; can be a CSV list for MLO): Name of the PCAP file. Note that for EA native dissection this must be a .pcap file. For TSHARK, any format (pcap, pcapng, gz) can be used.
pcapDir (String; can be a CSV list for MLO): Directory location for the file. If the file is in the default (pcapDirectory), this can be ignored. Otherwise, use a full path.
preferTshark (String, "true" or "false"): Indicates that tshark should be used for packet dissection, if possible. The current default is "false", but this is expected to change.
noDuplicates (String, "true" or "false"): Indicates that the parser should attempt to remove all duplicate packets in the capture. By default this is "false". This should only be used if there are multiple sniffers that are expected to be capturing the same packets and you do not want duplicates to be analyzed.
filters (String): Packet Types on which to filter. Exactly the same as on the UI (see above).
categories (String): Packet Categories on which to filter. Exactly the same as on the UI (see above).
actions (String): Packet Action Categories on which to filter. Exactly the same as on the UI (see above).
to (String): A comma-separated list of MAC addresses that should be in the Transmitter, or Source, address of the packet.
from (String): A comma-separated list of MAC addresses that should be in the Receiver, or Destination, address of the packet.
logic (String): "AND" or "OR". Used with the to/from addresses: packets must be To "AND" From the addresses, or To "OR" From the addresses.
firstPacket (String): The first packet number to use in the analysis.
lastPacket (String): The last packet number to use in the analysis.
direction (String, "dl" or "ul"): The direction of the traffic flow, if that information is needed for the analysis.
MLObands (Comma-separated list): A set of bands that match the input files.
action (String): Which of the available "tools" to select. The list follows.
  1. getTotalPacketCount, counts all packets in the file. example output
  2. getPacketList, returns the complete dissection for packets that satisfy the various filters. example output
  3. getAIDlist, returns AID lists (per BSSID) for captures with Trigger frames. example output
  4. getAID2MAC, returns an AID to MAC address mapping for captures with Trigger frames, if the mapping can be found. example output
  5. getTriggerInfo, returns Trigger statistics for captures with Trigger frames. example output
  6. getHeMuInfo, returns HE-MU statistics for captures with HE-MU frames. PPDU=2 example output
  7. getMainStream, attempts to find the main flow of data in the capture (from MAC, to MAC). example output
  8. getDFS, returns channel switch announcement information if it exists in the beacons. example output
  9. getSequenceNumberDeltas, uses the "main stream" found, and looks for gaps in sequence numbers. Useful for seeing if packets are being dropped/missed. example output
  10. getStats, uses the "main stream" found, and returns some basic statistics for each packet. example output
  11. getRoaming, looks for evidence of a roam event and returns roam time calculations. example output
  12. getDataOfType, uses the filters, pulls packets that pass the filters, but returns a much smaller amount of data than getPacketList. example output
  13. getMLO, requires multiple data files, and uses various packets to figure out the MLDs on the AP and the STA side. MLObands must be passed in for this function.
roamAnalyses (String): Which of the roam time analysis methods to use, when action=getRoaming is selected. The list follows.
  • AllData: Using any data packets, calculate a roam time.
  • ULpackets: Using only UPLINK packets from the roaming station, calculate a roam time.
  • TCP-UDP: Focus on only TCP or UDP packets, calculate a roam time.
  • Cisco: Very similar to AllData, but require that the packets be ACKed.
  • BlockACKs: Define roam time as being between Block ACKs before and after a BSSID change of the roaming station.

Returns

Varies, depending on the analysis performed. See the specific examples linked above.

Python

Example usage in Python

    # pcapFile, pcapDir, and REST_TIMEOUT are assumed to be defined already.
    baseURL = "http://localhost:8080/pcapAPI"
    api_endpoint = baseURL + "?fileName=" + pcapFile + "&pcapDir=" + pcapDir + "&firstPacket=1&lastPacket=1000&action=getAIDlist"

    print(">>> Running the captured file through a PCAP analysis. Get the AID list.")
    apiResponse = requests.get(api_endpoint, timeout=REST_TIMEOUT)
    returnObject = json.loads(apiResponse.text)
    apiListResult = returnObject["result"]
    print('>> aidsPerBssid: ', apiListResult["aidsPerBssid"])
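A roaming analysis follows the same pattern. A sketch, assuming action=getRoaming and roamAnalyses behave as described above; the capture file name is hypothetical, and the shape of the returned result varies per analysis:

```python
import requests
import json

def run_roaming_analysis(pcap_file, roam_method="ULpackets",
                         base_url="http://localhost:8080/pcapAPI"):
    """Run the getRoaming tool against a capture and return the parsed response.

    Sketch only: the shape of the returned "result" varies per analysis,
    so inspect it before relying on specific attributes.
    """
    params = {
        "fileName": pcap_file,
        "action": "getRoaming",
        "roamAnalyses": roam_method,  # AllData, ULpackets, TCP-UDP, Cisco, or BlockACKs
    }
    r = requests.get(base_url, params=params, timeout=300)
    return json.loads(r.text)

# Example (against a local server; "roamCapture.pcap" is a hypothetical file):
# result = run_roaming_analysis("roamCapture.pcap")
# print(json.dumps(result, indent=4))
```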

DEPRECATED