Python/Grafana Coding
#31
It could be that garbage collection on the sender is kicking in partway through the send. Garbage collection could interrupt the software serial transfer to the hardware buffer before the data goes out the line; the hardware send then runs out of data and waits for new data to be shuffled in...

Influx, or any database: the main cause of capping out around 100 writes per second is usually the transaction logging process. Batching bundles the commits into a single transaction write, so you write once instead of 100 times.
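That batching idea can be sketched in plain Python. The `BatchWriter` class and the flush callback here are illustrative stand-ins, not from any particular client library; the callback represents the single database write for a whole batch.

```python
# Minimal batching sketch: commit once per batch instead of once per point.
# The flush callback stands in for the single database write call.

class BatchWriter:
    def __init__(self, flush, batch_size=100):
        self.flush = flush            # called once with the whole batch
        self.batch_size = batch_size
        self.points = []

    def add(self, point):
        self.points.append(point)
        if len(self.points) >= self.batch_size:
            self.commit()

    def commit(self):
        if self.points:
            self.flush(self.points)   # one write for N points
            self.points = []

writes = []
w = BatchWriter(writes.append, batch_size=100)
for i in range(250):
    w.add({"value": i})
w.commit()            # flush the partial tail batch
print(len(writes))    # 3 writes instead of 250
```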
#32
Yeah, this slowness isn't even in the InfluxDB transaction; it happens before that.

I'm trying some other things to see what happens.
Proceed with caution. Knowledge is Power! Literally! Cool 
Knowledge is Power; Absolute Knowledge is Absolutely Shocking!
Certified 18650 Cell Reclamation Technician

Please come join in general chit-chat and randomness at https://discord.gg/c7gJ5uA
(this chat is not directly affiliated with SecondLifeStorage)
#33
Korishan: Can you post the loop that works without slowness and the loop that is slow?
The Ultimate DIY Solar and build place
YouTube / Forum system setup / My webpage  Diy Tech & Repairs

Current: 10kW Mpp Hybrid | 4kW PIP4048 | 2x PCM60x | 83kWh LiFePo4 | 10kWh 14s 18650 |  66*260W Poly
Upcoming: 14S 18650~30kWh | Automatic trip breakers, and alot more
#34
Just a quick update. I've been making some changes (hardware and software). I had to rework my networking setup because I wanted the laptop hardwired to the network instead of on wifi; that was causing a different issue.

Software-wise, I've verified that Python 2 and 3 are both installed, although it seems it doesn't matter which one I use; the results are basically the same. Other than some syntax restrictions, I didn't notice any difference.

However, after the changes I've made (not sure which one fixed this), I now have the serial.read() method available. Before, when I tried it, I got a "this method doesn't exist" error. I'm wondering if installing pyserial for Python 3 fixed it.
But, here is the test code I have cobbled together so far:
Code:
import time
import serial

# Port and baud are placeholders; adjust to your setup.
ser = serial.Serial('/dev/ttyUSB0', 9600, timeout=1)

cntr = 1
max_reads = 500000          # renamed: 'max' shadows the Python builtin
buf = ""
start = time.time()
prev = start
while cntr <= max_reads:
    char = ser.read()       # returns bytes in Python 3
    if len(char) > 0:
        if char != b'\n':
            buf += char.decode('ascii', errors='replace')
        else:
            now = time.time()
            ms = (now - prev) * 1000
            prev = now
            print(str(ms) + ":" + buf)
            buf = ""
    cntr += 1

end = time.time()
print("Exiting. It took:", (end - start) * 1000)

And I get this:
Code:
1724.40314293:0, 266, 120.65, 3.96, 477.94, 10.24, 1235.49, 1713.44
218.006849289:1, 484, 120.57, 3.94, 475.55, 10.21, 1230.62, 1706.17
232.012987137:2, 717, 119.86, 4.08, 488.75, 10.34, 1239.78, 1728.53
246.991157532:3, 965, 120.62, 3.98, 480.46, 10.92, 1317.82, 1798.28
363.041877747:4, 1329, 120.57, 4.06, 489.20, 10.22, 1232.72, 1721.91
241.02306366:5, 1570, 120.75, 4.03, 486.75, 10.12, 1222.14, 1708.89
302.004098892:6, 1874, 120.61, 4.04, 486.81, 10.39, 1252.79, 1739.60
331.015825272:7, 2206, 119.12, 3.96, 471.33, 10.38, 1235.94, 1707.27
('Exiting. It took:', 3773.9031314849854)

So it grabs those lines in about 4 seconds. The Arduino code sends a line every 1/8 second (125 ms):

Code:
void loop(){
  unsigned long currentTimer = millis();
  String buf;

  if (currentTimer - lastSerialUpdate > refreshSerial)
  {
    buf = String(counter) + ", " +
          String(currentTimer) + ", " +
          String(voltageLine1) + ", " +
          String(currentLine1) + ", " +
          String(currentLine1 * voltageLine1) + ", " +
          String(currentLine2) + ", " +
          String(currentLine2 * voltageLine1) + ", " +
          String((currentLine1 * voltageLine1) + (currentLine2 * voltageLine1));
    Serial.println(buf);
    lastSerialUpdate = currentTimer;
  }
}


I'm still tinkering with it. I'll redo the other two versions of the code, readline() and readlines(), tomorrow. It's late and I'm struggling to keep my eyes open.
Thanks for the help, guys. I'm trying to learn a new coding language and a new data interface at the same time.
#35
The line:
buf += char

will take the existing string buf, allocate new memory of length buf + char, copy buf and char into the new space, and then deallocate the old buf memory. This works OK for small strings, but the processing time (and garbage collection / memory churn) can become an issue as the strings grow, since the cost increases with every byte/character added.
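The usual Python workaround for that growing-copy cost is to collect the pieces in a list (cheap appends) and join once at the end. A small sketch comparing the two, with made-up chunk data:

```python
def build_with_concat(chunks):
    buf = ""
    for c in chunks:
        buf += c           # each += may reallocate and copy the whole string
    return buf

def build_with_join(chunks):
    parts = []
    for c in chunks:
        parts.append(c)    # O(1) append, no copying of earlier data
    return "".join(parts)  # one final allocation and copy

chunks = ["120.55", ", ", "3.92", "\n"] * 1000
assert build_with_concat(chunks) == build_with_join(chunks)
```

(CPython does optimize some `+=` cases in place, but the list-and-join form is the portable way to avoid the quadratic behaviour.)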

Another option would be to create a buffer array with a maximum size of, say, 100 characters, and copy each character into the next buffer position as it arrives, using a separate byte counter. This may be needed if you are after increasing the transfer rate a lot...
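A sketch of that fixed-size buffer idea (the size, names, and sample input are illustrative), assuming bytes arrive one at a time:

```python
BUF_SIZE = 100
buf = bytearray(BUF_SIZE)    # allocated once, reused for every line
pos = 0                      # separate byte counter

def on_byte(b):
    """Store one incoming byte; return a finished line on newline, else None."""
    global pos
    if b == ord('\n'):
        line = bytes(buf[:pos])   # copy out only the filled portion
        pos = 0
        return line
    if pos < BUF_SIZE:            # silently drop bytes past the buffer end
        buf[pos] = b
        pos += 1
    return None

# Feed it a fake serial line, byte by byte:
for b in b"0, 256, 120.55\n":
    result = on_byte(b)
print(result)   # b'0, 256, 120.55'
```

The buffer is never reallocated, so there is nothing for the garbage collector to clean up per character.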

Looking at the timing values, the initial long delay is object creation and initialisation of the code; the subsequent loops are reasonable, so the buffer array is not needed, as timing-wise the system can cope. The 363 ms may be when the garbage collector runs to clear up the buf copy memory, hence the extra 100 ms; same again for the 331 ms.

Let it run for around 50 lines and see the timing pattern...

The first call to ser.read() may account for the majority of the setup delay: the initial opening and setup of the serial port. That should be a one-time hit at startup.
#36
Thanks, I'll make some more changes to the code as you mention, switching to an array and copying the values over differently.
When I watch the screen, I can visually notice a longer delay. I'm not sure if that corresponds to the 300+ ms runs or is just due to the network connections. It all happens so fast I can't quite tell which line I notice it on.

I hadn't noticed the 300+ ms pattern before, but I definitely see it now. Thanks for bringing that up. Still learning code and its nuances.
#37
It's the small details that, in aggregate, create the problems. A lot of systems these days are written without scalability in mind, and then people wonder why the system fails at just a 10x increase and needs to be rewritten. High-level coding has taken away a lot of understanding of the underlying mechanics and the issues that can arise.

The network objects will have a lot more overhead, and they will be larger, so if they are being created and thrown away regularly, the garbage collector may be the ultimate issue to deal with; i.e. avoid creating and discarding objects in code so you don't give the garbage collector work to do. Also, if the network calls are a blocking type, they may allow the serial hardware buffer to overflow at high bit rates, e.g. during a brief wifi connection retry cycle, which I'm guessing is what you may have seen and why wired is not showing problems.

For the GPS task I had to deal with, I ended up with two microcontrollers connected together (early hardware parallel threading, I suppose): one controller just handled the timing and the second just handled the serial forwarding. This could be a route to a very high sample rate and consistency if the other processing (garbage collection / network) creates timing issues, but not for now... just code and learn.
#38
I tried the array approach, and that's not really going to work. For one, everything arrives as strings and I'm looking for numbers, so that makes reading things a little more difficult.

I also came across an interesting find: if I do the pyserial reads too fast, it gets confused somehow. That's what's splitting the lines. It's not buffer overflow or anything.

I did this quick code:
Code:
import time
import serial

# Port and baud are placeholders; the timeout is what makes readlines() return.
ser = serial.Serial('/dev/ttyUSB0', 9600, timeout=0.1)

cntr = 0
while cntr < 1000000:
    time.sleep(.125)
    lines = ser.readlines()          # returns whatever arrived, as bytes
    for line in lines:
        if len(line) > 0:
            line = line.rstrip(b"\n")    # bytes, not str, in Python 3
            print(line.decode('ascii', errors='replace'))
    cntr += 1

If I leave the time.sleep() in, the output lines look normal:
Code:
0, 256, 120.55, 3.92, 472.89, 2.51, 302.58, 775.47
1, 468, 120.55, 4.01, 483.68, 2.68, 323.48, 807.16
2, 697, 119.48, 3.94, 470.47, 2.64, 315.90, 786.37
3, 969, 121.17, 4.00, 484.27, 2.56, 310.36, 794.64
4, 1198, 120.01, 3.93, 471.12, 2.66, 319.58, 790.70
5, 1438, 120.74, 3.94, 475.89, 2.71, 327.71, 803.60
6, 1641, 119.47, 3.93, 469.77, 2.70, 322.46, 792.23
Most of the time, anyway. In previous runs I didn't hit any oddities; this time when I ran it, I had 2 split lines.

Now, if I remove the sleep() completely, I get this:
Code:
0, 253, 121.51, 3.98, 484.22, 49
.48, 6012.62, 6496.84
1, 448, 119.83, 4.09, 490.56, 9.
68, 1159.85, 1650.42
2, 653, 119.77, 4.07, 487.80, 9.
41, 1126.51, 1614.31
3, 901, 120.76, 4.13, 499.29, 9.
89, 1194.18, 1693.47
4, 1229, 119.62, 4.04, 482.97, 9
.69, 1159.11, 1642.07
5, 1470, 120.15, 4.06, 487.73, 9
.82, 1180.29, 1668.02
6, 1659, 121.76, 4.10, 499.29, 1
0.88, 1324.68, 1823.98
7, 1901, 119.13, 3.97, 473.46, 1
0.44, 1244.15, 1717.62
8, 2194, 120.68, 4.06, 489.56, 1
0.35, 1249.59, 1739.15
9, 2414, 119.51, 4.05, 484.13, 1
0.46, 1250.44, 1734.56
10, 2702, 119.55, 4.01, 478.96,
10.52, 1257.45, 1736.40
11, 2917, 120.02, 4.07, 488.08,
10.46, 1255.43, 1743.50
Every line is split. And I noticed the split moves: on the first read the split occurs between the "9" and the ".", on lines 1 - 3 the split is between the "." and the next digit, and on lines 4 - 9 the split is after the number and the ".".
The split is slowly moving up the string.

Now, if I use pyserial's miniterm tool, it splits the data out correctly with no errors at all; none, zilch, nada, zip.

OK, if I change the sleep() to 500 ms, the error rate drops to almost nothing. One thing I did find in these tests, though, is that I am now reading sequentially increasing line entries.
If you look at my previous posts, the line number would sometimes jump by 2 or 3, which means missing data. Now I'm at least getting all of those entries.
I let it run for about 500 entries and got 5 errors (split lines); accounting for those, I got exactly the same number of sequential entries as lines read.

So, for what it's worth, I'm calling this part a success.
I'm still wondering why it splits the lines, though; that makes no sense. I'll post the question on the pyserial GitHub issues and see what they say.
#39
Within the data, I am now assuming the second value in each line is the timestamp of the transmission point. With the variation in send times, you may be looking at the problem from more than one angle.

In the second set of data, between entries 2, 3, and 4 the transmit timing varies between around 250 ms and 330 ms, which is more than one iteration of sleep, so each loop will normally capture one line, unless the code executes at the same moment as the serial arrival, in which case it will split the line. This split may show up as random, or as a pattern like the second set of data (which also reveals some of the internal serial process timing).

The second set of output data is possibly splitting based on code loop execution time versus data arrival time in the serial buffer: one loop iteration takes about the same time as the serial line takes to receive X bytes of data, so one loop reads part of the line and the following loop reads the last bytes. Variation of a few bytes between reads would be consistent with this delay and the slight variation in byte counts.

In order to get to the 800 rate, you will have to send the data as a custom packet of bytes (two bytes for ID, two bytes for volts, etc.), read these as unsigned integers (high/low byte) or larger multi-byte numbers, and only convert them to text for display/storage, away from the serial read/write processing.
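That packet idea can be sketched with Python's struct module. The field layout and the x100 scaling below are my own assumptions for illustration, not a defined protocol; the Arduino side would need to pack the same layout.

```python
import struct

# Assumed layout '<HHH': little-endian, three unsigned 16-bit fields:
# packet ID, volts*100, amps*100. Six bytes instead of ~30 bytes of text.
PACKET = struct.Struct('<HHH')

def pack_reading(pkt_id, volts, amps):
    # Scale to hundredths so the decimals fit in unsigned 16-bit fields.
    return PACKET.pack(pkt_id, round(volts * 100), round(amps * 100))

def unpack_reading(data):
    pkt_id, cv, ca = PACKET.unpack(data)
    return pkt_id, cv / 100.0, ca / 100.0

raw = pack_reading(7, 120.55, 4.01)
print(len(raw))               # 6 bytes on the wire
print(unpack_reading(raw))    # (7, 120.55, 4.01)
```

The Python side would then read fixed-size chunks, e.g. ser.read(PACKET.size), and unpack them, keeping all text conversion out of the serial loop.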
#40
I can't really understand why you get so many errors.

If you always receive data on the host like this:

while data exists do
    buff = buff + data
    if buff has end char do
        process the line
        set buff empty
    done
loop

Then you never process data unless you have all the data for a line. You also should not add any delays when reading the data; you should not have to do any manual timing, since that is built in. Utilize the buffer instead.

Or is it just me that hasn't had those issues? I have done at least 50-100 values per second without them.

Edit: with readline() it will wait for EOL before going to the next step, so if it doesn't wait for it, you most likely have buffer or other problems.
For 1 byte at a time, I do it like this:
Code:
line = []

while True:
    c = ser.read(1)          # read a single byte (bytes in Python 3)
    if c:
        line.append(c)
        if c == b'\n':
            print("Line: " + b''.join(line).decode('ascii', errors='replace'))
            line = []

