Almost two years ago, my neighbors and I participated in a collective buying project for solar panels. Since then my roof has been equipped with 12 solar panels, which equates to a maximum rated power of 3000W. Not enough to be totally self-sufficient, but it comes close (becoming self-sufficient was never really the goal anyway).
The panels are connected to an inverter which in turn is connected to the electrical system of the house (through the white box in the green circle in the image below).

[Image: IMG_20200705_150115 (2) (Small)_LI.jpg — the inverter, with the dongle circled in red and the connection to the house's electrical system circled in green]
The system works in such a way that we first consume what we need from the generated energy, and whatever remains is fed back into the grid. This is all measured by the smart meter and sent to the utility company, so we can get paid for what we return.
To get an overview of how the system is performing, the inverter sends information to a central system run by the manufacturer of the inverter, in this case Growatt. I think it works the same for a lot of the other manufacturers as well. Growatt gives me access to an environment to look at the data in a number of different ways, both from a website and from an app on my phone. If you would like to see the environment for yourself, Growatt has made several demo plants available.

It looks nice, works okay, but I don't have access to the actual data myself (other than manually exporting .csv files). After succeeding in getting the data from my smart meter (see earlier parts in this series), the logical next step was to see if/how I could become the owner of my own (real-time) production data while maintaining delivery of the data to the Growatt server as well.

Data is sent to the Growatt server by using a dongle that is attached to the inverter (it's that black thing in the red circle in the image above with Growatt written on it). At first the company that fitted the panels gave me a dongle which connected to a separate box using RF technology. The box connected to my network with an ethernet cable and handled the communication with the Growatt server. In the beginning this worked quite well, but after some changes on the Growatt side it became very unreliable. Apparently the box needed an automatically delivered firmware update but never succeeded in installing it. The dongle/box combination was then replaced with a WiFi dongle that simply connects to my wireless network. Communication with the server has been stable ever since.
From examples built by others (on other platforms and in different languages), I gathered that the best way to get hold of the data was to get in between the dongle/server communication. The dongle comes with its own setup portal that allows you to set the address of the server the data is sent to. What if I could replace that server address with an address I control? It turns out this indeed gives access to everything the dongle sends! It also means I need to make sense of that data, as it is just a stream of bytes (again)!

The worker service revisited

To capture the data, I need a long running process. Now, where did I hear that before? Right, in the meter monitor program. So I started by creating a new project based on the worker service again and gave it the very uncreative name of 'GrowatMonitor'. (See part 2 for more information about this type of project). The bulk of the work is done in the InverterMonitor class. An instance of that class is added to the worker service by dependency injection and then executed from the ExecuteAsync method of the worker service:

protected override async Task ExecuteAsync(CancellationToken cancellationToken)
{
	while (!cancellationToken.IsCancellationRequested)
	{
		_logger.LogInformation("Worker executing at: {time}", DateTimeOffset.Now);
		Console.OutputEncoding = Encoding.Default;

		if (Utils.IsDaylight(_config.Latitude, _config.Longitude))
			await _monitor.Run();

		// Wait for 5 minutes
		Console.WriteLine("Sleeping for 5 minutes...");
		await Task.Delay(300000, cancellationToken);
	}
}

A thing I'm still working on is only running the monitor when the panels are actually producing energy (i.e. when there is daylight). I have all the code in place but have not yet succeeded in starting/stopping the underlying service successfully.

The InverterMonitor

As said, the important work is done in the InverterMonitor class. To be able to sit in between the inverter and the manufacturer's server, this class effectively needs to be a proxy server. Of course, there are a lot of C# proxy server samples out there, but none fitted my purposes exactly. This has to do with the fact that the relay of information is not always one-on-one. With some types of interactions, the inverter sends 15 separate messages to the server and the server only sends 1 confirmation message back to the inverter. Fortunately, a lot of people have already been working on deciphering the traffic that is sent back and forth. The protocol description document I found was of great help there. The monitor is executed by calling its Run method, which then does the following things:

  1. Create and start a TcpListener on the address and port the dongle connects to. In this case I did not change the port and used the IP address of the Pi.
  2. Receive the bytes from the listener (in a while loop)
  3. Process the bytes received
  4. When the loop is exited, gracefully shut down listener and other resources

This can of course all be found in the source code on my GitHub account. In the receive step, Sockets are used to receive the bytes from either the inverter or the Growatt server. A boolean variable (_listenToInverter), combined with a variable that keeps track of the previously received message type, is used to determine which socket we're receiving from and where to relay the message to. That happens, of course, only after we have peeked inside the message and stored its data somewhere safe.
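The receive-and-relay step described above can be sketched as follows. This is a simplified illustration with made-up method and variable names, not the actual InverterMonitor code; error handling and the message-type bookkeeping are left out:

```csharp
using System;
using System.Net.Sockets;

public static class RelaySketch
{
    // Receive one chunk from the side we are currently listening to and
    // forward it unchanged to the other side. The real monitor decodes and
    // stores the message in between ("peeking inside").
    public static byte[] Relay(Socket source, Socket destination)
    {
        var buffer = new byte[4096];
        int received = source.Receive(buffer);

        byte[] message = buffer[..received];
        // ... decode/store the message here before relaying it ...
        destination.Send(message);
        return message;
    }
}
```

In the monitor, _listenToInverter decides which socket plays the source role and which the destination role for the next receive.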

Looking at the data is done in the process step, which creates a Message out of the bytes received. Defining a class for these messages and decoding their contents based on the bytes was definitely the hardest part of this exercise. The protocol description mentioned above helped, but there were a lot of intricacies involved in finding the right starting point and length of bytes to read from the stream. There are a number of (non-C#) samples out there, but that, combined with the number of different protocol versions and the number of different hardware solutions, made it a bit of a scavenger hunt. Regardless of the version, a message always contains a header and a body. In newer versions, like mine, there is also a CRC. Just as with the messages coming from the smart meter, those coming from the inverter and the Growatt server are 'signed' with a CRC code. It turned out I could re-use the code written for the smart meter monitor. I therefore separated out the source code for the CRC calculation and created a NuGet package out of it. I didn't actually publish it in the NuGet feed but just installed it in both projects as a local package. Both the source and the package are available on my GitHub account.
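To give an idea of what such a CRC check involves, here is a sketch of a 16-bit CRC based on the reversed 0xA001 polynomial. Note the hedges: the initial value (0xFFFF below, the Modbus convention) and the byte order of the stored CRC differ between protocol variants, so treat this as an illustration rather than the exact code from my NuGet package:

```csharp
using System;

public static class Crc16
{
    // CRC-16 with reversed polynomial 0xA001 (Modbus convention: init 0xFFFF).
    public static ushort Compute(byte[] data)
    {
        ushort crc = 0xFFFF;
        foreach (byte b in data)
        {
            crc ^= b;
            for (int i = 0; i < 8; i++)
            {
                bool lsb = (crc & 1) != 0;
                crc >>= 1;
                if (lsb) crc ^= 0xA001;
            }
        }
        return crc;
    }

    // Verify a message whose last two bytes hold the CRC (low byte first
    // here; some protocol variants store it high byte first).
    public static bool Verify(byte[] message) =>
        Compute(message[..^2]) == (ushort)(message[^2] | (message[^1] << 8));
}
```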

For parsing the actual information out of the header and the body, the C# 8 range operator and index-from-end operator came in very handy. Instead of doing:

var body = new byte[4096];
for (int i = 8; i < buffer.Length - 2; i++)
    body[i - 8] = buffer[i];

to get all the bytes starting at position 8 up to the second-to-last byte (i.e. the actual body of a message, excluding the header and the CRC), I can now just do:

var body = buffer[8..^2];

Considering that I needed to do this kind of array manipulation a lot, you can imagine this helped quite a bit in keeping the code concise. A lot of the PHP samples I found made use of the unpack functionality to assign parsed-out parts of the array to variables. Something like that would have been useful to have in C# as well. Instead I used dictionaries combined with the range and index operations mentioned above.
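As an illustration of that dictionary approach: a dictionary can map field names to ranges, which are then sliced out of the body with the range operator. The field names and offsets below are made up for the example, not the real Growatt layout:

```csharp
using System;
using System.Collections.Generic;

public static class Unpack
{
    // Poor man's 'unpack': map field names to byte ranges in the body.
    // Offsets are illustrative only, not the actual Growatt layout.
    private static readonly Dictionary<string, Range> Layout =
        new Dictionary<string, Range>
        {
            { "serial", 0..10 },
            { "voltage", 10..12 },
            { "power", 12..16 }
        };

    public static Dictionary<string, byte[]> Parse(byte[] body)
    {
        var result = new Dictionary<string, byte[]>();
        foreach (var (name, range) in Layout)
            result[name] = body[range];
        return result;
    }

    // Numeric fields are big-endian in the protocol samples I based this on.
    public static ushort ToUInt16(byte[] field) =>
        (ushort)((field[0] << 8) | field[1]);
}
```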

Every message sent in either direction is of a certain type. The protocol and samples I found in Perl or PHP defined way more types than I encountered in my traffic. I put the ones that I found in an enum:

public enum MessageType
{
	ANNOUNCE = 0x0103,
	CURRDATA = 0x0104,
	HISTDATA = 0x0150,
	PING = 0x0116,
	CONFIG = 0x0118,
	IDENTIFY = 0x0119,
	REBOOT = 0x0120,
	CONFACK = 0x3129
}

This makes working with the types in the code a lot friendlier. By using a switch statement on the types, I could decode each specific message I received. Decoding a PING message, which is by far the simplest message, looks like this:

private Dictionary<string, object> DecodePing()
{
	Dictionary<string, object> result = new Dictionary<string, object>(2)
	{
		{ "id", Id },
		{ "datalogger", Content[8..^2] }
	};

	return result;
}

This allows me to process that message in the monitor class like below:

private Message ProcessPing(Dictionary<string, object> data)
{
	if (_listenToInverter)
	{
		Console.WriteLine($"==> Received ping from {Display(data["datalogger"])}");
		if (_datalogger == null)
			_datalogger = (byte[])data["datalogger"];
	}

	// Send Identify
	if (!_appConfig.ActAsProxy)
	{
		// (identify request sent here; omitted for brevity)
	}

	Message reply = Message.Create(MessageType.PING, (byte[])data["datalogger"], (ushort)data["id"]);
	return reply;
}

The same kind of processing is done for all the other message types as well.
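To make that dispatching concrete, here is a stripped-down sketch using a switch expression. The header offset (bytes 6 and 7, device id plus record type in the samples I worked from) and the helper names are my assumptions for illustration; the real code processes the decoded dictionary per type in the same spirit:

```csharp
using System;

// Trimmed copy of the MessageType enum from the article.
public enum MessageType : ushort
{
    ANNOUNCE = 0x0103,
    CURRDATA = 0x0104,
    PING = 0x0116
}

public static class MessageSketch
{
    // Read the two-byte message type from the header. The offset is an
    // assumption based on the protocol samples I found.
    public static MessageType ReadType(byte[] message) =>
        (MessageType)((message[6] << 8) | message[7]);

    // Simplified dispatch on the type; the real monitor decodes the
    // message into a dictionary first and then processes it.
    public static string Dispatch(byte[] message) =>
        ReadType(message) switch
        {
            MessageType.PING => "ProcessPing",
            MessageType.ANNOUNCE => "ProcessAnnounce",
            MessageType.CURRDATA => "ProcessCurrentData",
            _ => "Ignore"
        };
}
```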

Maybe passing the actual data around as objects is not the best or most beautiful approach, but as I know exactly what the data represents (based on the protocol), I can easily cast the objects to the right type.
The Display helper function above just converts the byte array to a human-readable string by using an Encoding.Default.GetString((byte[])data) call. The code sample above also shows that you can configure the program to not act as a proxy. In that case each response to a received message has to be sent by the monitor program itself.

Telegrams again?

I was afraid that I would lose a lot of data points while not being connected to the Growatt server in the normal way, but that turned out not to be the case. If the dongle is unable to deliver the data to the Growatt server, it will keep the data in memory. Once the connection is restored, it will transmit that data. So two of the message types contain data that needs to be sniffed and stored. I'm re-using the concept of a telegram class to mould the data into something that can be serialized to a store and, just like with the meter monitor, I'm using a Storage Table for that. The telegram in this project is a bit simpler than the one in the meter monitor. This time it is basically just a collection of properties that hold the data of the electricity generation at a specific point in time. Storing the telegrams in the Azure Storage Table is done in exactly the same way as described in part 4.
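Conceptually, the telegram for this project is then hardly more than a data holder, something like the sketch below. The property names are illustrative (the real class follows the Growatt field layout), and the partition/row key scheme shown is one plausible way to key the rows by timestamp, as in the meter monitor:

```csharp
using System;

// Conceptual sketch of the inverter telegram: just properties capturing
// one snapshot of the generation data. Property names are illustrative.
public class InverterTelegram
{
    public DateTime Timestamp { get; set; }
    public double CurrentPowerWatts { get; set; }  // momentary output
    public double EnergyTodayKwh { get; set; }     // generated today
    public double EnergyTotalKwh { get; set; }     // lifetime total

    // For an Azure Storage Table, partition and row keys can be derived
    // from the timestamp (one partition per day, one row per reading).
    public string PartitionKey => Timestamp.ToString("yyyyMMdd");
    public string RowKey => Timestamp.ToString("HHmmss");
}
```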


That is it for part 5. Questions? Remarks? Let me know in the comments below! In the next part I'll talk about combining application secrets with running this as a systemd service. Spoiler alert: it ain't easy to do that!

