Updated Sensor YAML for 2023/24: Adding National Grid’s DFS Dates to Home Assistant

National Grid have brought back the DFS scheme for the winter of 2023-24 and have made changes both to how the scheme works and to the data exposed via the API. Most significantly for the data, there is now a single combined dataset instead of separate datasets for Live and Test events.

Home Assistant Sensor YAML

I strongly suggest using a package to keep the sensor YAML together rather than cluttering up your configuration file. The code below should be placed in a file called national_grid_dfs_202324.yaml located in the directory /homeassistant/packages/National Grid DFS 2023-24/. You can, of course, pull the relevant code out of the example below and insert it directly into your configuration file.
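
If you haven't already enabled packages, a minimal configuration.yaml entry along the lines below (assuming a packages folder sitting next to your configuration file; adjust the path if yours differs) will load every YAML file in that folder, including files in subfolders:

    homeassistant:
      packages: !include_dir_merge_named packages/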

The following package creates two sensors: one for the start date and time, and one for the end date and time of the latest event. The template loops through the returned data looking for the latest event, then finds the start time of the first session and the end time of the last session in that event. This example is set up for Octopus Energy; see the original post Adding National Grid’s DFS Dates to Home Assistant for information on how to change it to another supplier.

national_grid_dfs_202324:
## National Grid Demand Flexibility Service
    sensor:

        # DFS Events - Octopus Accepted
        
        # Start Time and Date
      - platform: rest
        resource: "https://api.nationalgrideso.com/api/3/action/datastore_search_sql?sql=SELECT%20COUNT(*)%20OVER%20()%20AS%20_count,%20*%20FROM%20%22ed7019b0-32b7-425c-a2fb-5ba9e32733fb%22%20WHERE%20%22Status%22%20=%20'Accepted'%20AND%20%22Registered%20DFS%20Participant%22%20=%20'OCTOPUS%20ENERGY%20LIMITED'%20ORDER%20BY%20%22_id%22%20ASC%20LIMIT%20100"
        scan_interval: 900
        name: National Grid ESO Latest DFS Event Start 23-24
        unique_id: 2324_national_grid_eso_latest_event_dfs_start
        value_template: >-
            {# The API may return 'Delivery Date' as either dd/mm/yyyy or yyyy-mm-dd, so handle both formats #}
            {% set d = value_json['result']['records'][0]['Delivery Date'] %}
            {% if '/' in d %}
              {% set ns=namespace(start_time = strptime(d+' '+value_json['result']['records'][0]['From'],'%d/%m/%Y %H:%M')) %}
              {# Keep the earliest 'From' time of any session on that delivery date #}
              {% for record in value_json['result']['records'] %}
                {% if record['Delivery Date'] == d -%}
                  {% if strptime(d+' '+record['From'],'%d/%m/%Y %H:%M') <= ns.start_time -%}
                    {% set ns.start_time = strptime(d+' '+record['From'],'%d/%m/%Y %H:%M') %}
                  {%- endif %}
                {%- endif %}
              {% endfor %}
            {% else %}
              {% set ns=namespace(start_time = strptime(d+' '+value_json['result']['records'][0]['From'],'%Y-%m-%d %H:%M')) %}
              {% for record in value_json['result']['records'] %}
                {% if record['Delivery Date'] == d -%}
                  {% if strptime(d+' '+record['From'],'%Y-%m-%d %H:%M') <= ns.start_time -%}
                    {% set ns.start_time = strptime(d+' '+record['From'],'%Y-%m-%d %H:%M') %}
                  {%- endif %}
                {%- endif %}
              {% endfor %}
            {% endif %}
            {{ ns.start_time }}

        # End Time and Date
      - platform: rest
        resource: "https://api.nationalgrideso.com/api/3/action/datastore_search_sql?sql=SELECT%20COUNT(*)%20OVER%20()%20AS%20_count,%20*%20FROM%20%22ed7019b0-32b7-425c-a2fb-5ba9e32733fb%22%20WHERE%20%22Status%22%20=%20'Accepted'%20AND%20%22Registered%20DFS%20Participant%22%20=%20'OCTOPUS%20ENERGY%20LIMITED'%20ORDER%20BY%20%22_id%22%20ASC%20LIMIT%20100"
        scan_interval: 900
        name: National Grid ESO Latest DFS Event End 23-24
        unique_id: 2324_national_grid_eso_latest_event_dfs_end
        value_template: >-
            {# The API may return 'Delivery Date' as either dd/mm/yyyy or yyyy-mm-dd, so handle both formats #}
            {% set d = value_json['result']['records'][0]['Delivery Date'] %}
            {% if '/' in d %}
              {% set ns=namespace(end_time = strptime(d+' '+value_json['result']['records'][0]['To'],'%d/%m/%Y %H:%M')) %}
              {# Keep the latest 'To' time of any session on that delivery date #}
              {% for record in value_json['result']['records'] %}
                {% if record['Delivery Date'] == d -%}
                  {% if strptime(d+' '+record['To'],'%d/%m/%Y %H:%M') >= ns.end_time -%}
                    {% set ns.end_time = strptime(d+' '+record['To'],'%d/%m/%Y %H:%M') %}
                  {%- endif %}
                {%- endif %}
              {% endfor %}
            {% else %}
              {% set ns=namespace(end_time = strptime(d+' '+value_json['result']['records'][0]['To'],'%Y-%m-%d %H:%M')) %}
              {% for record in value_json['result']['records'] %}
                {% if record['Delivery Date'] == d -%}
                  {% if strptime(d+' '+record['To'],'%Y-%m-%d %H:%M') >= ns.end_time -%}
                    {% set ns.end_time = strptime(d+' '+record['To'],'%Y-%m-%d %H:%M') %}
                  {%- endif %}
                {%- endif %}
              {% endfor %}
            {% endif %}
            {{ ns.end_time }}
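
To use a supplier other than Octopus Energy, only the resource URL needs changing. Decoded, the filter in the embedded SQL reads "Registered DFS Participant" = 'OCTOPUS ENERGY LIMITED'; swap in your supplier's name exactly as it appears in the dataset (for example 'AXLE ENERGY LIMITED', though do check the exact registered name), keeping the spaces URL-encoded as %20 in the resource string:

    ...%22Registered%20DFS%20Participant%22%20=%20'AXLE%20ENERGY%20LIMITED'...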

Using the Sensors

Now that you’ve got the data, it’s possible to create automations in Home Assistant to turn off any unnecessary devices at the start of an event and turn them back on at the end. To do this, I created two date-time helpers which trigger those automations. To keep the helpers up to date, you need an automation that either runs on a regular schedule or is triggered by the sensors refreshing; you can use something like the following in its action section. Repeat this for each helper and its corresponding sensor.

      - service: input_datetime.set_datetime
        data:
          datetime: >-
            {{
            strptime(states('sensor.2324_national_grid_eso_latest_event_dfs_start'),
            '%Y-%m-%d %H:%M:%S') }}
        target:
          entity_id: input_datetime.dfs_event_start
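
Once the helpers are being kept up to date, the automations that act on them are simple. Below is a minimal sketch that turns a device off at the start of an event and back on at the end; it assumes a second helper called input_datetime.dfs_event_end and a purely hypothetical switch.tumble_dryer, so substitute your own helper and device entities:

      - alias: DFS event started - turn off devices
        trigger:
          # Fires at the date and time held in the start helper
          - platform: time
            at: input_datetime.dfs_event_start
        action:
          - service: switch.turn_off
            target:
              entity_id: switch.tumble_dryer # hypothetical device - replace with your own

      - alias: DFS event ended - turn devices back on
        trigger:
          - platform: time
            at: input_datetime.dfs_event_end
        action:
          - service: switch.turn_on
            target:
              entity_id: switch.tumble_dryer # hypothetical device - replace with your own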


8 Responses

  1. Mark Berry says:

    Any plans to update this for this year's scheme? I've updated it myself with the new API URLs etc., but I'm not sure what I've done wrong as the new sensors I've tried to create are just listed as unavailable!

    • Jamie says:

      Hi Mark, I’ve just put out an initial version of the updated 24-25 YAML. It looks like National Grid have included seconds, rather than just hours and minutes, in the datetime fields of the new dataset, so I’ve updated the templates to reflect this. As there have been no ‘proper’ events yet, I can’t totally guarantee it’ll work, although I’ve tested it with the AXLE ENERGY LIMITED onboarding event that happened this week and it returned sensible data. As the YAML is set to look for Octopus Energy events, it’ll return unavailable until the first event they’re included in.

  2. Jason says:

    Hi Jamie,

    I’m relatively new to HA and trying to get packages setup and working for the first time.

    When I first tried to follow your guide I got an invalid slug error message when I tried to restart HA. I think I have fixed that: the text in your article says “The code below should be placed in file called national_grid_dfs–202324.yaml located”, which I copied and pasted, but I think it should be an underscore rather than a hyphen/dash before the 202324, as per line 1 of the YAML?

    Anyway, that got rid of that error for me. Now when I try to restart HA I get a different error message: “Failed to reload configuration exceptions must derive from BaseException”

    I didn’t have a homeassistant: block at the start of my config.yaml, the first section was default_config: so I just inserted the following ahead of it

    homeassistant:
      packages: !include_dir_named packages/

    Any ideas?

    Many thanks for the blog & keep up the good work!

    • Jamie says:

      Assuming you’re using a folder called packages located in the same directory as the config.yaml file, then the following should work at the beginning of the config file:

      homeassistant:
        packages: !include_dir_merge_named packages/

      Regarding the difference in names: as I’ve used include_dir_merge_named rather than include_dir_named, it ignores the file name and instead uses the name specified at the top of the package file to reference sensors etc. This may be why I didn’t spot the difference in names!

  3. marc says:

    I have just set this up now, after an event. It has not created any sensors in Home Assistant that I can find; I am looking in Developer Tools for sensor.2324_national_grid_eso_latest_event_dfs_start. Is this expected behaviour (will they be created once a session is planned), or is this an error on my part?

    • Jamie says:

      Hi marc – it should create the sensor once you’ve restarted HA / reloaded the YAML; however, the sensor won’t have a value until the next session is posted.

  4. Bruce says:

    Thanks !

  5. Thomas says:

    Thank you very much for this!
