
Script for adding in retrospective data (using mysql/maria) #378

Open
benlumley opened this issue Dec 11, 2023 · 11 comments
Labels
enhancement New feature or request

Comments

@benlumley

My data often doesn't update for an hour or two on the api.glowmarkt.com API this plugin uses; when it does, because this integration just takes a daily sum, you get hours with big peaks rather than accurately spread-out usage. (From querying other APIs, I think this is caching on Glow's side; you can sometimes see more up-to-date data on different APIs, or in the Bright app, than is available in the daily sum.)

Anyway, this script queries the hourly data for a day, corrects/adds it in Home Assistant's statistics and statistics_short_term tables, and then updates all records after the affected ones so that the statistics still add up (HA statistics store a running total, so you also need to update records later in time than the ones you change).
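To illustrate the running-total point with a minimal sketch (conceptual only; it ignores the state column and the day-reset handling that the full script below deals with), correcting one hour by some delta means every later row's sum has to shift by the same delta:

import MySQLdb

def shift_later_sums(db, metadata_id, hour_start_ts, delta):
    # Conceptual equivalent of what the script does row by row: once one hour's
    # reading changes by `delta` kWh, every later running total must move too.
    c = db.cursor()
    c.execute(
        "UPDATE statistics SET sum = sum + %s WHERE metadata_id = %s AND start_ts > %s",
        (delta, metadata_id, hour_start_ts),
    )
    db.commit()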

I've not touched the yearly records; I wasn't as interested in them, and since the end result is the same they should still tally.

Run it by passing an argument for 'days ago', or leave it blank to do today. I've got it running (via HA) as a shell_command every night and have had it like this for a month or so:

shell_command:
  glow_fetch: ./scripts/glow-scrape.py 2; ./scripts/glow-scrape.py 1; ./scripts/glow-scrape.py;

Figured it might be of interest/useful to someone.

You need to find/replace in it:
Your metadata_id - look in statistics_meta for sensor.electric_consumption_today (see the lookup sketch below)
Your database credentials
The relevant resource_id, application_id and token in place of the '...' bits in requests.get

And back your database up first!
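For the metadata_id, a quick lookup sketch (it reuses the same connection settings as the script below; swap in your own credentials, and change the statistic_id if your sensor is named differently):

import MySQLdb
import MySQLdb.cursors

db = MySQLdb.connect(host='core-mariadb', user='dbuser', password='dbpass',
                     database='homeassistant', charset='utf8',
                     cursorclass=MySQLdb.cursors.DictCursor)
c = db.cursor()
c.execute("SELECT id, statistic_id FROM statistics_meta WHERE statistic_id = %s",
          ("sensor.electric_consumption_today",))
print(c.fetchone())  # the 'id' column is the value to use for METADATA_ID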

#!/usr/local/bin/python

import requests
from datetime import datetime, timedelta
import sys
import MySQLdb
import MySQLdb.cursors

# metadata_id for sensor.electric_consumption_today - look yours up in statistics_meta
METADATA_ID = 27

# get 'days ago' from the command line; default to today
if len(sys.argv) < 2:
    days_to_subtract = 0
else:
    days_to_subtract = int(sys.argv[1])


db = MySQLdb.connect(host='core-mariadb',
                     user='dbuser',
                     password='dbpass',
                     database='homeassistant',
                     charset='utf8',
                     cursorclass=MySQLdb.cursors.DictCursor)
c = db.cursor()
c2 = db.cursor()

d = datetime.today() - timedelta(days=days_to_subtract)
now = datetime.now()
start = d.replace(hour=0, minute=0, second=0, microsecond=0)
end = d.replace(hour=23, minute=59, second=59, microsecond=0)

print(start.strftime("%Y-%m-%dT%H:%M:%S"))
print(end.strftime("%Y-%m-%dT%H:%M:%S"))

url = 'https://api.glowmarkt.com/api/v0-1/resource/.../readings'
r = requests.get(url,
                 headers={
                     "applicationId": "...",
                     "token": "..."
                 },
                 params={
                     "period": "PT1H",
                     "function": "sum",
                     "from": start.strftime("%Y-%m-%dT%H:%M:%S"),
                     "to": end.strftime("%Y-%m-%dT%H:%M:%S")
                 })

data = r.json()

# epoch timestamps of the first and last hourly readings returned
first_time = data['data'][0][0]
last_time = data['data'][-1][0]

# main statistics table
print("main statistics...")
c.execute("SELECT * FROM statistics WHERE metadata_id = %s AND start_ts < %s ORDER BY start_ts DESC LIMIT 1", (METADATA_ID, first_time));
preceding_record = c.fetchone()

print(preceding_record)

state = 0;
# sum =  preceding_record[sum];
sum = preceding_record["sum"];

written = 0
# walk the hourly readings, rebuilding the day's state and running sum
for record in data['data']:
    # skip zero readings at the start of the day (no data reported yet)
    if written == 0 and record[1] == 0:
        print("skipping")
        continue

    written = 1
    state = state + record[1]
    sum = sum + record[1]
    print(record[0], round(record[1], 4), round(state, 4), round(sum, 4))

    c.execute("SELECT id FROM statistics WHERE metadata_id = %s AND start_ts = %s", (METADATA_ID, record[0]))
    if c.rowcount == 0 and record[0] < now.timestamp():
        print ("create")
        c.execute("INSERT INTO statistics SET state = %s, sum = %s, metadata_id = %s,  start_ts = %s, created_ts = UNIX_TIMESTAMP()", 
        (round(state, 4),round(sum,4),METADATA_ID, record[0]))
    elif c.rowcount > 0:
        c.execute("UPDATE statistics SET state = %s, sum = %s WHERE metadata_id = %s AND start_ts = %s",
            (round(state, 4),round(sum,4),METADATA_ID, record[0]));

    c.execute("SELECT id FROM statistics_short_term WHERE metadata_id = %s AND start_ts >= %s AND start_ts < %s", (METADATA_ID, record[0], record[0]+3600))
    for row in c:
        print("Updating short term row " + str(row["id"]))
        print (c2.execute("UPDATE statistics_short_term SET sum = %s, state=%s WHERE id = %s", (round(sum,4), round(state,4), row["id"])));            
        
    

c.execute("SELECT * FROM statistics WHERE metadata_id = %s AND start_ts > %s ORDER BY start_ts ASC ", (METADATA_ID, last_time));
for row in c:
    if row["state"] == state:
        # sum stays same
        0
    elif row["state"] < state:
        sum = sum + row["state"];
    else:
        sum = sum + row["state"] - state;
    
    state = row["state"];

    c2.execute("UPDATE statistics SET sum = %s WHERE id = %s", (round(sum,4), row["id"]));


print ("Updating short term after " + str(last_time))
c.execute("SELECT * FROM statistics_short_term WHERE metadata_id = %s AND start_ts >= %s ORDER BY start_ts ASC ", (METADATA_ID, last_time+3600));
for row in c:
    if row["state"] == state:
        # sum stays same
        0
    elif row["state"] < state:
        sum = sum + row["state"];
    else:
        sum = sum + row["state"] - state;
    
    state = row["state"];
    print("Forwards updating short term row " + str(row["id"]))
    c2.execute("UPDATE statistics_short_term SET sum = %s WHERE id = %s", (round(sum,4), row["id"]));


db.commit()
benlumley added the enhancement label Dec 11, 2023
@Benniepie

Thank you for writing this

@AN53808

AN53808 commented Jan 19, 2024

Thanks for writing this script. I'm a Home Assistant novice and keen to give it a try to resolve my inaccurate energy data.
I've worked out which metadata_id I need, but I'm struggling to work out where to look for my resource_id, application_id and token. Any help greatly appreciated. Also, I'm running the standard SQLite DB, so does the script need modifying for this? Secondly, if I get it working for my electricity, can I change the metadata_id and resource_id to also run the script for my gas in the same way?
Apologies for all the questions.

@biggeeus

Thanks for writing this script. I'm a home assistant novice and keen to give this a try to resolve my inaccurate energy data. I've worked out which metadata_id I need, but I'm struggling to work out where to look for my resource_id, application_id and token. Any help greatly appreciated. Also, I'm running the standard SQLite db, so does the script need modifying for this? Secondly, if I get it working for my electricity, can I change the metadata_id and resource_id to also run the script for my gas in the same way. Apologies for all the questions.

I can answer the first part of your question: I used this guide to get the resource_id, application_id and token, which allowed me to run the script.
https://glowmarkt.com/home/blogs/extracting-energy-data-from-the-hildebrand-glowmarkt-api-in-5-easy-steps
I too would like to know if I can use it for gas.
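For reference, the steps in that guide look roughly like this (a sketch only, not tested here: the endpoint paths, headers and response fields are as the guide describes them and may have changed, and the applicationId still has to come from the guide/Hildebrand):

import requests

APPLICATION_ID = "..."  # from the guide / Hildebrand

# 1. Authenticate with your Bright app username/password to get a token
auth = requests.post(
    "https://api.glowmarkt.com/api/v0-1/auth",
    headers={"applicationId": APPLICATION_ID, "Content-Type": "application/json"},
    json={"username": "you@example.com", "password": "your-bright-password"},
)
token = auth.json()["token"]

# 2. List your resources and note the resourceId for electricity (and gas) consumption
resources = requests.get(
    "https://api.glowmarkt.com/api/v0-1/resource",
    headers={"applicationId": APPLICATION_ID, "token": token},
)
for res in resources.json():
    print(res.get("classifier"), res.get("resourceId"))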

@benlumley
Author

benlumley commented Jan 30, 2024 via email

@AN53808

AN53808 commented Jan 31, 2024

Thanks for the link to the API help guide. Since my original post I also discovered the API guide, and I managed to get my token and resource_ids using Postman. I've also modified the original script to work with the default SQLite database, and I've got it working with my gas usage: I did this by changing the metadata_id and resource_id (passing them in as variables to the script) and it worked just fine. I'm also experimenting with cost data, which seems to work, but I've had to divide the data by 100 and add in the daily standing charges. I'm still getting a few oddities at the start of the current day, due to the lag in data reporting, but these seem to get mopped up after a day or so.
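For anyone else adapting it, the SQLite changes look roughly like this (a sketch assuming the default recorder database at /config/home-assistant_v2.db; note that sqlite3 uses '?' placeholders instead of '%s', its cursor.rowcount isn't populated for SELECTs so the existence checks need fetchone() instead, and the MySQL-only "INSERT ... SET" / UNIX_TIMESTAMP() syntax needs rewriting):

import sqlite3

# Default Home Assistant recorder database path; adjust if yours differs.
db = sqlite3.connect("/config/home-assistant_v2.db")
db.row_factory = sqlite3.Row   # lets row["sum"] / row["state"] keep working
c = db.cursor()
c2 = db.cursor()

# Queries then use '?' placeholders, e.g. the preceding-record lookup becomes:
c.execute(
    "SELECT * FROM statistics WHERE metadata_id = ? AND start_ts < ? "
    "ORDER BY start_ts DESC LIMIT 1",
    (27, 1700000000.0),  # example values: metadata_id and an epoch start_ts
)
row = c.fetchone()
print(dict(row) if row else None)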

@biggeeus

Just tried updating my gas and it works great. Is there any reason why I can't run this script every hour, to keep the current day correctly populated, such as a limit on the number of API calls?

@benlumley
Author

I found it of little value to do that; the inaccuracies/lag in the Glow data that cause the main ha-hildebrand-dcc integration to get gaps in its data often don't resolve for a good number of hours. But I'm not aware of any reason you couldn't.

@biggeeus

Been running it every hour this morning and I see what you mean. If only I could get an Octopus Home Mini I could use that instead, but unfortunately I only have a SMETS1 meter and it doesn't work with it.

@hirenshah

hirenshah commented May 3, 2024

Is there a way to adapt this script to go through an entire month doing the same thing?

Edit:

ChatGPT sorted it out for me :)

This updates the data for the last 90 days:

#!/usr/local/bin/python

import subprocess

# Define the path to your existing Python script
existing_script_path = "./glow_scrape.py"

# Loop from 1 to 90
for param in range(1, 91):
    # Execute the existing script with the current parameter
    subprocess.run(["python", existing_script_path, str(param)])

@hirenshah

Noticed that the script (or it could be the source data) doesn't cater for British Summer Time. Usage is showing an hour earlier than it should.

@AN53808

AN53808 commented May 14, 2024

The script stores and retrieves the DCC energy data in UTC (without a time zone), and the Home Assistant database stores it the same way as a Unix-style timestamp. For me this works fine during BST, because Home Assistant displays the energy graph in my current timezone (i.e. BST) by offsetting the UTC-stored data, so no timezone conversion of the actual data is needed.
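A minimal illustration of that offset (the start_ts value is just an example epoch as stored in statistics.start_ts):

from datetime import datetime, timezone

start_ts = 1715644800.0  # example: 2024-05-14 00:00:00 UTC
utc = datetime.fromtimestamp(start_ts, tz=timezone.utc)
local = utc.astimezone()  # the timezone the machine / Home Assistant is configured with

print(utc.isoformat())    # 2024-05-14T00:00:00+00:00
print(local.isoformat())  # 2024-05-14T01:00:00+01:00 when the local zone is BST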
