Python/Grafana Coding
#21
I still don't have time to help out here and solve it myself, so I suggest:

1. Lower the number of send-outs of data on the Arduino. Set it to something like 2x per second.
2. On the receiver, make sure you actually receive one full line at a time before processing.
3. (Overkill, but it's always good practice) Add a start byte and include a CRC code. Then you can easily verify that the data received matches the data sent. Same as all inverters and other devices do.

If you mismatch so that you send too much and fill the buffer, you will get issues.
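The start-byte + CRC idea in point 3 could look something like this on the Python side. This is a minimal sketch: the start byte value, the length field, and the CRC-16 choice are my assumptions, not anything from the actual Arduino code in this thread.

```python
import binascii
import struct

START = 0x7E  # hypothetical start-of-frame marker


def make_frame(payload: bytes) -> bytes:
    """Wrap a payload as: start byte, length, payload, CRC-16."""
    crc = binascii.crc_hqx(payload, 0xFFFF)  # CRC-16/CCITT
    return bytes([START, len(payload)]) + payload + struct.pack(">H", crc)


def parse_frame(frame: bytes):
    """Return the payload if the frame checks out, else None."""
    if len(frame) < 4 or frame[0] != START:
        return None
    length = frame[1]
    if len(frame) < 4 + length:
        return None                      # incomplete frame, wait for more
    payload = frame[2:2 + length]
    (crc,) = struct.unpack(">H", frame[2 + length:4 + length])
    if binascii.crc_hqx(payload, 0xFFFF) != crc:
        return None                      # sent != received, drop it
    return payload
```

The Arduino side would build the same layout byte for byte; any frame that fails the CRC simply gets dropped instead of corrupting a reading.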
The Ultimate DIY Solar and build place
YouTube / Forum system setup / My webpage  Diy Tech & Repairs

Current: 10kW Mpp Hybrid | 4kW PIP4048 | 2x PCM60x | 83kWh LiFePo4 | 10kWh 14s 18650 |  66*260W Poly
Upcoming: 14S 18650~30kWh | Automatic trip breakers, and alot more
Reply
#22
With software there are always many ways to code many solutions; they just vary in how long they take to execute. If execution time is not critical, then any solution that works is OK.

Option 1 :
Read byte by byte from serial into a string until you get a line feed / carriage return, and then split / parse the string to process it.

Option 2 :
Read the serial into a single string (not an array) and then split that string on CR/LF first. Perform a find to check there is a CR/LF before trying to parse.
Then split each of the read lines on the comma.

Option 3 :
Adjust the code so that it has a timing delay to synchronise with the sender, so that the read happens after you know the send has occurred (plus a delay) and you will always have a full buffer / line.
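Option 1 could be sketched like this in Python. The helper is my own hypothetical wrapper, kept separate from the serial port so the parsing itself is easy to test; with pyserial you would feed it `ser.read(ser.in_waiting or 1)` inside the read loop.

```python
class LineAssembler:
    """Accumulate raw serial bytes and hand back complete lines (Option 1)."""

    def __init__(self):
        self._buf = bytearray()

    def feed(self, data: bytes) -> list:
        """Feed incoming bytes; return every line completed by a CR/LF."""
        lines = []
        for b in data:
            if b in (0x0D, 0x0A):        # CR or LF ends the current line
                if self._buf:            # ignore the empty half of CR+LF
                    lines.append(self._buf.decode("ascii", "replace"))
                    self._buf.clear()
            else:
                self._buf.append(b)
        return lines
```

Each returned line can then be split on the comma; a partial line simply stays in the buffer until the rest of it arrives on the next read.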
Reply
#23
Daromer: Slowing down the reading kind of defeats the purpose. I'm working towards reading at least 800 times a second, so I need to get this figured out.
I have no problem reading the lines. They are just coming in much slower than they should be.

completelycharged:
1) All lines are coming in complete with the new code. I have no issues with that now.
2) I have done this multiple times, reading into a single string. The latest code does not read from the serial into an array. That part is fine.
3) I had a timer delay, and it makes no difference.

I think you guys missed the point I was making. When I read the serial buffer as an array, I could read 3-4 lines per pass through the Python loop. That equates to about a read every 1/8 of a second, which is the rate the Arduino Nano is sending at.
When I read the serial buffer one line at a time, it drastically slows down the reading, taking at least twice as long to process.
In a 1/4 second with the old code, I could get 4-5 entries, even though there were sporadic errors. Even then, it was only 1 bad entry out of about 20 or so, representing about 1/8 of a second.
With the new code, reading one line at a time, I'm missing over half of my entries from the Arduino.

For speed, I need to read the data from the serial as an array, unless there's a faster way.
Right now I'm sending data from the Arduino every 1/8 of a second, i.e. at 125 ms intervals. Later on, I'm going to need to send at a rate close to every 1.25 ms.

Maybe what I'll try is changing the Arduino code to send batches of entries instead of one entry at a time: every 250 ms, send a batch of entries.
Proceed with caution. Knowledge is Power! Literally! Cool 
Knowledge is Power; Absolute Knowledge is Absolutely Shocking!
Certified 18650 Cell Reclamation Technician

Please come join in general chit-chat and randomness at https://discord.gg/c7gJ5uA
(this chat is not directly affiliated with SecondLifeStorage)
Reply
#24
With limited processing power, high-level code sometimes has a very big overhead on what we see as simple operations. When reading one line at a time, the internal code performs a find, then allocates new memory, copies the bytes up to the first find, allocates another buffer, copies the unused bytes into it, and deallocates the first buffer. Add to this another alloc / copy / dealloc if you're appending the data to another string.

The byte-by-byte route can work a lot more efficiently if you pre-allocate a set block of memory (an array), write the data into it byte by byte, and then read out the same memory space as a string. This avoids new memory being allocated and copied; it's just byte shuffling. It's even easier and faster if the incoming data is a fixed length, as you can then just read, say, 4 bytes of memory as a number. CR/LF resets the write position to 0.
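As a rough Python sketch of that pre-allocated route (the buffer size and names are my own; the point is that the same bytearray is reused for every line instead of growing a string each time):

```python
BUF_SIZE = 64                 # fixed, allocated once
buf = bytearray(BUF_SIZE)
pos = 0


def feed_byte(b):
    """Write one incoming byte into the fixed buffer.

    Returns the finished line as bytes on CR/LF, else None.
    """
    global pos
    if b in (0x0D, 0x0A):
        line = bytes(buf[:pos]) if pos else None
        pos = 0               # CR/LF resets the write position to 0
        return line
    if pos < BUF_SIZE:        # drop overflow instead of reallocating
        buf[pos] = b
        pos += 1
    return None
```

The one copy left is `bytes(buf[:pos])` when a line completes; everything before that is just writing bytes into memory that already exists.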

High-level code is nice to write, but sometimes the overhead of nice is slow.

Batch sending potentially only avoids the issue for now; it will come back and bite you when you try to run at 1.25 ms.

The other thing to consider is the timing impact of the UDP send or database write on the serial hardware buffer / interrupt handling. The last thing you want is incoming bytes being lost to a buffer overflow; then you have a whole host of new issues...

I had to resort to machine code to calculate the pulse timing from GPS (1 µs), counting how many clock cycles (at 20 MHz) each instruction takes to work out the exact clock cycle the pulse arrived on. That was R&D in 1999...

Larger batch sizes sometimes end up taking exponentially longer to process, due to the repeated memory allocation / copy / deallocation on larger strings.

Just having a look: at say 40 bytes per sample, if you're going to 1.25 ms, that's 800 x 40 bytes x 8 bits, so your data rate needs to be a minimum of 256 kbit/s...
You need to transmit the values byte-encoded or as raw byte values, not as strings...
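Working that 256 kbit figure through (the 40-bytes-per-sample size is the post's own assumption; the 10-bits-per-byte line adds the usual UART start/stop framing on top):

```python
samples_per_s = 800          # one sample every 1.25 ms
bytes_per_sample = 40        # rough ASCII line length, per the post

payload_bps = samples_per_s * bytes_per_sample * 8
print(payload_bps)           # 256000 bit/s of payload, as stated above

# On the wire a UART byte is ~10 bits (start + 8 data + stop),
# so the link itself has to run faster still:
wire_bps = samples_per_s * bytes_per_sample * 10
print(wire_bps)              # 320000 -> the next common baud rate up is 500000
```

Which is why sending raw byte values instead of ASCII strings matters so much here: a 40-character text line might pack into well under 10 raw bytes.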
Reply
#25
800 lines per second... WHY? Why do you need to poll that much data every second to monitor some electricity? I might have missed the point, but if you plan on that high a speed you need to dig into what I have said a couple of times about how you fetch the data. You NEED to make sure you fetch all the data properly and always look for a start and end byte. At that HIGH speed there are tons of things that can cause issues, I'm afraid.


I must be missing something, because you talk about 800 reads per second and then a read "every 1/8th" of a second? Smile I have started out like you many times, wanting max performance, and in the end had to back off.

I have always learned that when starting with serial you need to start slow and then ramp up, unless you actually look into the limits of the serial protocol itself.
Batch sending is generally not recommended on an Arduino due to its limited processing power and memory. Those small units work better if you just send the data pre-allocated, as completelycharged said: pre-allocate a buffer, fill it with data and send it. Strings are the worst enemy of Arduinos in high-speed applications.

On the receiving side, wait for data to appear. Check the data so you have it all. When you KNOW that you have it all, then send it for processing.

In runnable Python (pyserial), roughly:

import serial  # pyserial

ser = serial.Serial("/dev/ttyUSB0", 115200, timeout=0)
buffer = b""

while True:                              # loop all over
    if ser.in_waiting:                   # the serial port has data
        buffer += ser.read(ser.in_waiting)
    while b"\n" in buffer:               # only hand off complete lines
        line, buffer = buffer.split(b"\n", 1)
        process(line)                    # your parsing / Influx code here
Reply
#26
My early coding.... build, change, adapt, bin, start again. Rinse repeat.

Main point, don't give up....
Reply
#27
Yeah giving up is never an option Big Grin
Reply
#28
Reading at 800+ times per second lets you do very detailed analysis of your power consumption. You can see when certain devices come on and actually see their power ripple in the data. This can also alert you to things like a failing electric pilot light on a gas burner, a capacitor on a fridge/aircon/etc., or any other type of load that can wear down over time. Each device creates a particular ripple on the line.

I'm basically replicating this setup:



If I dump the data directly to serial without any type of parsing, I get no errors. By "directly" I mean it's in nothing but the main loop with a serial.println(buf) command. It runs just fine: no errors, every line. And I can read it directly through the Python script with print(serial.readline()), again with no errors, no problem.
But for some odd reason it slows down to printing about one line per 1/4 second or so. I don't get that. Why would it slow down when it's doing less work?

I'm probably going to have to wait until I get my ESP8266/ESP32-based one built and use that. Then I don't have to use any serial printing at all; I can send the data directly to InfluxDB.
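One possible culprit for that slowdown (an assumption on my part, not something confirmed in this thread): pyserial's readline() blocks until it sees a newline or the port's timeout expires, so a generous timeout can serialise the whole loop to one line per timeout period. Opening the port with timeout=0 and splitting lines yourself avoids that; the splitting part is just:

```python
def drain(pending: bytes):
    """Split out every complete line; return (lines, leftover bytes)."""
    *lines, rest = pending.split(b"\n")
    return [ln.rstrip(b"\r") for ln in lines], rest
```

In the loop you'd do something like `pending += ser.read(4096)` on a port opened with `timeout=0`, then `lines, pending = drain(pending)`, so the reader never sleeps waiting for a newline to show up.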
Reply
#29
I suppose I could rewrite the reading code to do a serial.read() instead. That reads one byte at a time.

Will see what I get....
Reply
#30
When does it slow down? I'm not grasping what you change that makes it slow down. Can you show the code you add when it slows down?
Reading each byte should not be needed. But as said, 800 times per second is bloody darn fast, so you'd better have a good serial connection running Smile

Can you show me the code where it reads 800 per second, and how you verify that? I.e. do you count incoming lines to 800 and then print "800 reached"?
Also add the code from when you have issues. (I know you posted it earlier, but I never got which was which.)

The next thing is that saving 800 values per second into InfluxDB will demand a very fast InfluxDB. You will need a caching layer in front of it, or to batch the writes. Influx on nice, fast hardware can easily do several hundred thousand values per second, but you need to send them in big batches. On a Raspberry Pi you won't get a fraction of that, but 800 should still be doable with good batching.
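A sketch of that batching layer (my own hypothetical helper; the write itself would be whatever client you use, e.g. the Python influxdb client's write_points() fed one batch at a time):

```python
class Batcher:
    """Collect points and release them in fixed-size batches."""

    def __init__(self, batch_size: int = 800):
        self.batch_size = batch_size
        self._points = []

    def add(self, point):
        """Add one point; return a full batch when ready, else None."""
        self._points.append(point)
        if len(self._points) >= self.batch_size:
            batch, self._points = self._points, []
            return batch
        return None
```

With 800 samples per second and batch_size=800, that is one database write per second instead of 800. A timer-based flush would also be needed in practice, so a slow stream doesn't sit in the buffer forever.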

Edit: On my Influx here, when I push around 500 values per second I need to add a delay. I think I can reach 100 values per second MAX before Influx times out, on 4 cores and 8 GB of RAM. Though in this case Influx gets tons of other values from other hosts; I'm at around 5000 values per second in total without batching, so it's choked.
Reply

