Attempting to export a CSV of a 1M-point long-memory capture from a Rigol "DS1152E-EDU" (which is really a DS1052E with bandwidth unlocked up to 150 MHz) with the time column included results in duplicate time records. It looks like there's a loss of time precision going on, probably something like a rounding error:
# a couple oneliners to find duplicates in columns that I took from
# https://stackoverflow.com/questions/32084888/awk-how-do-i-find-duplicates-in-a-column
# Returns nothing
awk -F, 'a[$1]++{count++} END{print count}' normal.csv
# Again, returns nothing
awk -F, '$1 in a{print "line " NR; a[$1]; print} {a[$1]=$0}' normal.csv
# Returns 49571
awk -F, 'a[$1]++{count++} END{print count}' longmem.csv
# Returns a huge list of duplicated values in time column
awk -F, '$1 in a{print "line " NR ": " a[$1]} {a[$1]=$0}' longmem.csv | tail -n 10
line 1048566: 0.0104649,-0.15624
line 1048567: 0.0104649,-0.15624
line 1048568: 0.0104649,-0.15624
line 1048569: 0.0104649,-0.15624
line 1048570: 0.0104649,-0.15624
line 1048571: 0.0104649,-0.15624
line 1048572: 0.0104649,-0.15624
line 1048573: 0.0104649,-0.15624
line 1048575: 0.010465,-0.15624
line 1048576: 0.010465,-0.15624
# And, just to be sure, it seems to be the case indeed
tail -n 10 longmem.csv
0.0104649,-0.15624
0.0104649,-0.15624
0.0104649,-0.15624
0.0104649,-0.15624
0.0104649,-0.15624
0.0104649,-0.15624
0.0104649,-0.15624
0.010465,-0.15624
0.010465,-0.15624
0.010465,-0.15624

I've attached normal.csv and longmem.csv in captures.zip.
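A minimal Python sketch of the suspected failure mode (the sample interval `dt` and start time `t0` here are made-up values, not taken from the actual capture): when a slowly advancing time value is formatted with only ~6 significant digits, neighbouring samples collapse into identical strings, exactly like the duplicated rows above.

```python
# Illustration only: dt and t0 are assumed, not read from the scope.
dt = 2e-8     # hypothetical sample interval (50 MSa/s)
t0 = 0.0104   # hypothetical start time, near where the duplicates appear

# Format each timestamp with 6 significant digits, as the export seems to do.
times = [f"{t0 + i * dt:.6g}" for i in range(1000)]

# Count how many formatted timestamps are duplicates of an earlier one.
dups = len(times) - len(set(times))
# dups is large: steps of 2e-8 are below the ~1e-7 resolution of 6
# significant digits at t ~= 0.0104, so runs of samples print identically.
```

Printing with more significant digits (e.g. `:.9g`) makes every timestamp in this range distinct again, which is why this looks like a formatting/rounding issue rather than corrupted data.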
I almost missed it, but I suspect this is what forced me to spend an evening trying to figure out why my absolutely synchronous signals started to drift apart in, and only in, the middle of my dataset :) :

(I'm feeding those CSVs to a Python script for further manual scaling/alignment/slicing so I can visually compare them in a plot.)
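Until the export is fixed, one workaround sketch for scripts like the one described above: ignore the exported time column entirely and rebuild a monotonic time axis from the row index. The function name and the `dt` value below are assumptions for illustration; the real sample interval should be read from the scope or the waveform preamble.

```python
import csv

def rebuild_time_axis(path, dt):
    """Read the voltage column from a two-column time,voltage CSV and
    regenerate the time column from the row index, so duplicated
    timestamps in the exported file no longer matter.

    dt is the sample interval in seconds (assumed known externally).
    """
    with open(path, newline="") as f:
        volts = [float(row[1]) for row in csv.reader(f)]
    times = [i * dt for i in range(len(volts))]
    return times, volts
```

This assumes the CSV has no header row (as in the `tail` output above); if the export includes one, skip the first row before parsing.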