gridprotectionalliance / openhistorian

The Open Source Time-Series Data Historian

License: MIT License

Batchfile 0.06% C# 21.74% PLSQL 0.47% CSS 0.15% JavaScript 7.82% Shell 0.01% HTML 3.91% TypeScript 63.44% PowerShell 0.01% Rich Text Format 0.15% TSQL 1.08% SCSS 0.87% ASP.NET 0.01% Go 0.01% CUE 0.30%
time-series historian data-mining data-storage data-stream

openhistorian's Introduction

LogoBanner

openHistorian

CodeQL

The openHistorian is a back office system designed to efficiently integrate and archive process control data, e.g., SCADA, synchrophasor, digital fault recorder or any other time-series data used to support process operations.

The openHistorian is optimized to store and retrieve large volumes of time-series data quickly and efficiently, including high-resolution sub-second information that is measured very rapidly, e.g., many thousands of times per second.

openHistorian Web Interface

Overview

The openHistorian 2 is built using the GSF SNAPdb Engine - a key/value pair archiving technology developed to significantly improve the ability to archive extremely large volumes of real-time streaming data and directly serve the data to consuming applications and systems.

Through use of the SNAPdb Engine, the openHistorian inherits very fast performance with very low lag time for data insertion. The openHistorian 2 is a time-series implementation of the SNAPdb Engine where the "key" is a tuple of time and measurement ID, and the "value" is the stored data - which can be almost any data type and associated flags.

The system comes with a high-speed API that interacts with an in-memory cache for very fast extraction of near real-time data. The archive files produced by the openHistorian are ACID compliant, which creates a very durable and consistent file structure that is resistant to data corruption. Internally, the data structure is based on a B+ tree that allows out-of-order data insertion.
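
As an illustration of this key/value model, below is a minimal read sketch against the openHistorian .NET API. It assumes the HistorianClient, HistorianKey, and HistorianValue types and the default data service port 38402 seen in the openHistorian documentation examples; exact namespaces and member names may differ between versions.

    // A sketch only: connect to a local openHistorian instance and read the
    // last minute of data. Namespaces below reflect the openHistorian/GSF
    // libraries but may vary by version.
    using System;
    using GSF.Snap;
    using GSF.Snap.Filters;
    using GSF.Snap.Services;
    using GSF.Snap.Services.Reader;
    using openHistorian.Net;
    using openHistorian.Snap;

    class ReadLastMinute
    {
        static void Main()
        {
            // Connect to the openHistorian data service and open the "PPA" instance.
            using (HistorianClient client = new HistorianClient("127.0.0.1", 38402))
            using (ClientDatabaseBase<HistorianKey, HistorianValue> database =
                   client.GetDatabase<HistorianKey, HistorianValue>("PPA"))
            {
                HistorianKey key = new HistorianKey();
                HistorianValue value = new HistorianValue();

                // The key is (timestamp, point ID); restrict the scan to the last minute.
                SeekFilterBase<HistorianKey> timeFilter =
                    TimestampSeekFilter.CreateFromRange<HistorianKey>(
                        DateTime.UtcNow.AddMinutes(-1), DateTime.UtcNow);

                // Passing null for the point filter returns every point ID in range.
                TreeStream<HistorianKey, HistorianValue> stream =
                    database.Read(SortedTreeEngineReaderOptions.Default, timeFilter, null);

                while (stream.Read(key, value))
                    Console.WriteLine($"{key.TimestampAsDate:o} [{key.PointID}] = {value.AsSingle}");
            }
        }
    }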

The openHistorian service also hosts the GSF Time-Series Library (TSL), creating an ideal platform for integrating streaming time-series data processing in real-time:

openHistorian Overview

Three utilities are currently available to assist in using the openHistorian 2. They are automatically installed alongside openHistorian.

  • Data Migration Utility - Converts openHistorian 1.0 / DatAWAre Archives to openHistorian 2.0 Format - View Screen Shot
  • Data Trending Tool - Queries Selected Historical Data for Visual Trending Using a Provided Date/Time Range - View Screen Shot
  • Data Extraction Utility - Queries Selected Historian Data for Export to a CSV File Using a Provided Date/Time Range - View Screen Shot

Where openHistorian Fits In

Documentation and Support

  • Documentation for openHistorian can be found in the openHistorian wiki.
  • Get in contact with our development team on our new discussion boards.
  • View old discussion board topics here.

Deployment

For detailed instructions on deploying the openHistorian, see the installation guide.

  1. Make sure your system meets all the requirements below.
  2. Choose a download below.
  3. Unzip, if necessary.
  4. Run openHistorianSetup.msi.
  5. Follow the wizard.
  6. Enjoy.

Requirements

  • .NET 4.6 or higher.
  • 64-bit Windows 7 or newer.
  • HTML 5 capable browser.
  • Database management system such as:
    • SQL Server (Express version is fine)
    • MySQL
    • Oracle
    • PostgreSQL
    • SQLite* (included, no extra install required)

* Not recommended for large deployments.

Downloads

  • Download a stable release here.
  • Download the nightly build here.

Contributing

If you would like to contribute please:

  • Read our styleguide.
  • Fork the repository.
  • Work your magic.
  • Create a pull request.

License

openHistorian is licensed under the MIT License.


openhistorian's Issues

openhistorian storage

I need help creating a remote openHistorian server.
I have two servers with openPDC and openHistorian; the servers have the same settings and they run in a Windows 2008 failover cluster.

  • Can I create a remote openHistorian server for data storage?
  • Besides the failover server, can I use two servers as a load-balancing cluster?

openhistorian

Thanks
Pietro

Archive files are missing data

Hi,

We found that some of the PMU data is not archived to the .d2(i) files. Some of the historical data cannot be found, whether via the API or the 'Historian Data Viewer'. The missing data may span from 1 second to around 10 seconds, and this happens several times every hour.
We then added an action in openHistorian to count how much data is received in every frame. It shows that all data and all frames are received by the action, which means openHistorian received the data, yet some of the frames are not archived. We retrieved the data several hours after it was received by openHistorian, and some of it still cannot be found.
In short, the data comes in at 36,000 frames/hour and all frames are received by openHistorian, but on average around 400 frames/hour cannot be found via the API or 'HistorianView.exe'.
What should we do to fix this issue? Could it be something wrong with the configuration?
We are running openHistorian 2.0.415 on Windows Server 2012 with .NET Framework 4.5 and the openHistorian 2.1 release on Windows 10 with .NET Framework 4.6. Both of them are suffering from missing data.

Best wishes,
Wenpeng

Initialization Issue

Hi,
I was trying to write an action adapter to handle the FNET event. I put the built adapter under the openHistorian v2.2.2.0 directory and I saw it in the openHistorian Manager. Then I configured the parameters and clicked 'Initialize'. However, there was no message on the console even though I had written the OnStatusMessage() call in Initialize("xxxx"). I also added some logging to check, but it seemed the logic I wrote in Initialize() was never called. To make sure my code was right, I debugged the same code (but with different dependencies) against the latest Project Alpha v0.2.107.0. That time, Initialize() ran well, and so did the PublishFrame() logic. I could not figure out where the problem was. Any suggestions?
Best,
Ferriad Wang
FNET, UTK

Data Validation

Hi,

I have a question regarding the .d2 file naming. It seems the openHistorian cannot prevent data that is too old or too new from coming in? Please see attached.
image

In the image, you can see that some .d2 files contain timestamps many years in the future; I would assume this should not be inserted into the database?

Multiple indexes

To improve the performance of queries in a large-scale system, it would be beneficial to have a way to store aggregates or filtered data sets in separate archives. The proposed solution would be to add a rule-based engine to the LocalOutputAdapter to enable the user to set up multiple archive instances and route data to the appropriate archive.

Output Streams

Why does openHistorian 2.0.347 not handle "concentrator output streams"?

Service unexpected stop

I had an exception that caused the openHistorian service to stop.
After my initial investigation, I found several TLS issues reported in the Windows Event Viewer:
Log Name: System
Source: Service Control Manager
Date: 12/2/2019 5:56:48 PM
Event ID: 7034
Task Category: None
Level: Error
Keywords: Classic
User: N/A
Computer: server
Description:
The openHistorian service terminated unexpectedly. It has done this 1 time(s).
Event Xml:
<Event xmlns="http://schemas.microsoft.com/win/2004/08/events/event"> <System> <Provider Name="Service Control Manager" Guid="{555908d1-a6d7-4695-8e1e-26931d2012f4}" EventSourceName="Service Control Manager" /> <EventID Qualifiers="49152">7034</EventID> <Version>0</Version> <Level>2</Level> <Task>0</Task> <Opcode>0</Opcode> <Keywords>0x8080000000000000</Keywords> <TimeCreated SystemTime="2019-12-02T22:56:48.514528500Z" /> <EventRecordID>96351</EventRecordID> <Correlation /> <Execution ProcessID="1176" ThreadID="32240" /> <Channel>System</Channel> <Computer>server</Computer> <Security /> </System> <EventData> <Data Name="param1">openHistorian</Data> <Data Name="param2">1</Data> <Binary>6F00700065006E0048006900730074006F007200690061006E000000</Binary> </EventData> </Event>
Log Name: System
Source: Schannel
Date: 12/2/2019 5:56:46 PM
Event ID: 36874
Task Category: None
Level: Error
Keywords:
User: SYSTEM
Computer: server
Description:
An TLS 1.1 connection request was received from a remote client application, but none of the cipher suites supported by the client application are supported by the server. The TLS connection request has failed.
Event Xml:
<Event xmlns="http://schemas.microsoft.com/win/2004/08/events/event"> <System> <Provider Name="Schannel" Guid="{1F678132-5938-4686-9FDC-C8FF68F15C85}" /> <EventID>36874</EventID> <Version>0</Version> <Level>2</Level> <Task>0</Task> <Opcode>0</Opcode> <Keywords>0x8000000000000000</Keywords> <TimeCreated SystemTime="2019-12-02T22:56:46.163853100Z" /> <EventRecordID>96350</EventRecordID> <Correlation ActivityID="{669EAB56-57E8-0004-58AB-9E66E857D501}" /> <Execution ProcessID="1192" ThreadID="8508" /> <Channel>System</Channel> <Computer>server</Computer> <Security UserID="S-1-5-18" /> </System> <EventData> <Data Name="Protocol">TLS 1.1</Data> </EventData> </Event>
Log Name: System
Source: Schannel
Date: 12/2/2019 5:56:37 PM
Event ID: 36874
Task Category: None
Level: Error
Keywords:
User: SYSTEM
Computer: server
Description:
An TLS 1.0 connection request was received from a remote client application, but none of the cipher suites supported by the client application are supported by the server. The TLS connection request has failed.
Event Xml:
<Event xmlns="http://schemas.microsoft.com/win/2004/08/events/event"> <System> <Provider Name="Schannel" Guid="{1F678132-5938-4686-9FDC-C8FF68F15C85}" /> <EventID>36874</EventID> <Version>0</Version> <Level>2</Level> <Task>0</Task> <Opcode>0</Opcode> <Keywords>0x8000000000000000</Keywords> <TimeCreated SystemTime="2019-12-02T22:56:37.048801300Z" /> <EventRecordID>96349</EventRecordID> <Correlation ActivityID="{669EAB56-57E8-0004-58AB-9E66E857D501}" /> <Execution ProcessID="1192" ThreadID="8508" /> <Channel>System</Channel> <Computer>server</Computer> <Security UserID="S-1-5-18" /> </System> <EventData> <Data Name="Protocol">TLS 1.0</Data> </EventData> </Event>
Log Name: System
Source: Schannel
Date: 12/2/2019 5:56:27 PM
Event ID: 36874
Task Category: None
Level: Error
Keywords:
User: SYSTEM
Computer: server
Description:
An TLS 1.2 connection request was received from a remote client application, but none of the cipher suites supported by the client application are supported by the server. The TLS connection request has failed.
Event Xml:
<Event xmlns="http://schemas.microsoft.com/win/2004/08/events/event"> <System> <Provider Name="Schannel" Guid="{1F678132-5938-4686-9FDC-C8FF68F15C85}" /> <EventID>36874</EventID> <Version>0</Version> <Level>2</Level> <Task>0</Task> <Opcode>0</Opcode> <Keywords>0x8000000000000000</Keywords> <TimeCreated SystemTime="2019-12-02T22:56:27.855590700Z" /> <EventRecordID>96348</EventRecordID> <Correlation ActivityID="{669EAB56-57E8-0004-58AB-9E66E857D501}" /> <Execution ProcessID="1192" ThreadID="8508" /> <Channel>System</Channel> <Computer>server</Computer> <Security UserID="S-1-5-18" /> </System> <EventData> <Data Name="Protocol">TLS 1.2</Data> </EventData> </Event>
Log Name: System
Source: Schannel
Date: 12/2/2019 5:56:17 PM
Event ID: 36874
Task Category: None
Level: Error
Keywords:
User: SYSTEM
Computer: server
Description:
An TLS 1.1 connection request was received from a remote client application, but none of the cipher suites supported by the client application are supported by the server. The TLS connection request has failed.
Event Xml:
<Event xmlns="http://schemas.microsoft.com/win/2004/08/events/event"> <System> <Provider Name="Schannel" Guid="{1F678132-5938-4686-9FDC-C8FF68F15C85}" /> <EventID>36874</EventID> <Version>0</Version> <Level>2</Level> <Task>0</Task> <Opcode>0</Opcode> <Keywords>0x8000000000000000</Keywords> <TimeCreated SystemTime="2019-12-02T22:56:17.729538200Z" /> <EventRecordID>96347</EventRecordID> <Correlation ActivityID="{669EAB56-57E8-0004-58AB-9E66E857D501}" /> <Execution ProcessID="1192" ThreadID="8508" /> <Channel>System</Channel> <Computer>server</Computer> <Security UserID="S-1-5-18" /> </System> <EventData> <Data Name="Protocol">TLS 1.1</Data> </EventData> </Event>
Log Name: System
Source: Schannel
Date: 12/2/2019 5:56:17 PM
Event ID: 36874
Task Category: None
Level: Error
Keywords:
User: SYSTEM
Computer: server
Description:
An TLS 1.1 connection request was received from a remote client application, but none of the cipher suites supported by the client application are supported by the server. The TLS connection request has failed.
Event Xml:
<Event xmlns="http://schemas.microsoft.com/win/2004/08/events/event"> <System> <Provider Name="Schannel" Guid="{1F678132-5938-4686-9FDC-C8FF68F15C85}" /> <EventID>36874</EventID> <Version>0</Version> <Level>2</Level> <Task>0</Task> <Opcode>0</Opcode> <Keywords>0x8000000000000000</Keywords> <TimeCreated SystemTime="2019-12-02T22:56:17.712391600Z" /> <EventRecordID>96346</EventRecordID> <Correlation ActivityID="{669EAB56-57E8-0004-58AB-9E66E857D501}" /> <Execution ProcessID="1192" ThreadID="8508" /> <Channel>System</Channel> <Computer>server</Computer> <Security UserID="S-1-5-18" /> </System> <EventData> <Data Name="Protocol">TLS 1.1</Data> </EventData> </Event>

At the same time, on the openHistorian side, I found the logs below:

Screen Shot 2019-12-03 at 3 00 09 PM

Any ideas from the logs?

Grafana and OH User Synchronization requires log-on

Adding a user to the OH and running /grafana/synchUsers won't result in the user being synchronized to Grafana.

image

However, if the user account logs into Grafana for the first time, it will automatically be synched from the OH.

API open Historian

I am currently working with openHistorian for a POC and I'd like to send the data stored in openHistorian to a time-series database, InfluxDB.
I cannot populate the InfluxDB database.
Is there any guide that I can follow to fix the problem?
Is there any guide that explains step by step the process of "data transmission" from openHistorian to another time-series database?

Huge Memory usage

Setting up a new internal subscriber in openHistorian (version 2.0.347.0) requires a large amount of memory.

openhistorian_ram

Thanks,
Pietro

Full Documentation for OpenHistorian

Dear All,

We need documentation for openHistorian that guides anybody through using it, with screenshots: how to configure devices and how everything works, so that it gives clarity to users. If anybody has such documentation, please share it so I can understand the flow and how to use openHistorian in the best possible way.

Thanks in Advance

Data missing when exporting data from the openHistorian web manager

We have openHistorian connected to openPDC with the Gateway transport (localhost:6165).
When you export data from the openHistorian web manager, for example 2 days (6 phasors), we have these issues:

  1. openHistorian: connection to openPDC interrupted
  2. openHistorian: data loss caused by point 1
  3. openPDC: concentrator output stream interrupted
  4. openPDC: custom output stream interrupted

Data loss on grafana
image

openhistorian logs
image

[3/26/2019 11:21:13 AM] [WAMS] No data received in 5.0 seconds, restarting connect cycle...
[3/26/2019 11:21:13 AM] [WAMS] Data subscriber command channel connection to publisher was terminated.
[3/26/2019 11:21:14 AM] [SNAPENGINE] Pending Tables V1: 6 V2: 1 V3: 0
[3/26/2019 11:21:17 AM] [WAMS] Attempting connection to tcp://localhost:6165...
[3/26/2019 11:21:17 AM] [WAMS] Attempting command channel connection to publisher...
[3/26/2019 11:21:17 AM] [WAMS] Connection established.
[3/26/2019 11:21:17 AM] [WAMS] Data subscriber command channel connection to publisher was established.
[3/26/2019 11:21:18 AM] [WAMS] TSSC algorithm reset before sequence number: 30933
[3/26/2019 11:21:18 AM] [WAMS] Success code received in response to server command "Subscribe": Client subscribed as compact unsynchronized with 5457 signals.

Adding New Device (hardware PMU)

Hello Everyone,

I am trying to ADD DEVICE in openHistorian 2.0 portal, so that I can retrieve data seamlessly instead of using the conventional Historian Playback Utility software.

The problem I have is that it does not allow me to save the device (the Save Device button is not clickable), even after I assign all the phase quantities shown in red in the picture below.

The connection file is for a hardware PMU that has been tested with the PMU Connection Tester and is in use with openPDC 2.7. I appreciate any help on this.

OpenHistorian2_AddingDeviceIssue

Thanks,
Ali

OpenHistorian Action Adapter Unknown Stop

Hello,

Recently we have seen an unexplained stop among our action adapters. We had 4 action adapter instances using the same .dll file. In mid-April, one of them stopped working for about 8 hours until we re-initialized it, while at the same time the other three adapter instances worked well. Not sure if you know about this issue or have fixed it in later versions.

Regards,
Ferriad
FNET@UTK

connecting the openhistorian 2 with PMU

Hi
I was able to connect to the PMU in an SEL-421 relay using the PMU Connection Tester, but I wasn't able to connect with openHistorian 2; it's giving me this error: (connection attempt failed).
Would you please show me the right steps to add a new input device? I tried the synchrophasor walk-through wizard but nothing changed.

Error setting output stream

When you set up an output stream, it isn't possible to insert a PMU with the device wizard; the link seems disabled.

openHistorian version 2.0.347.0

openhistorian_device_wizard

PIAdapter not loaded

Hi, I am using the openHistorian 2.4 release version and I found that the PI adapter no longer appears in the openHistorian Manager. However, I do find that PIAdapter.dll exists in the installation folder. Not sure if you are aware of this issue.
image
image

linux support

Hi,

I am currently working with openHistorian for a POC and wanted to check whether I can put effort into creating a Linux-based deployment (with Mono).

Have any efforts been made on this in the past?
Which components of openHistorian are purely native to Windows?

Thanks

API to query measurement details

https://github.com/GridProtectionAlliance/SIEGate/blob/master/Source/Documentation/wiki/Creating_Internal_Gateway_Connections.md

Following the example API shown there, I created an API client to read data from openHistorian. However, the data can only be read by PointID, which is 1, 2, 3, ...
Because it is really hard to tell which signal a PointID refers to, more information is needed.
Is there a way the details of the measurements can be queried at the same time? For example, the signal names (e.g. TVA_TESTDEVICE: ABBS).
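
One illustrative way to get that mapping, as a hedged sketch: query the measurement metadata from the openHistorian configuration database. The table and column names below (Measurement, PointID, PointTag, SignalReference, Description) are assumptions based on the standard GSF schema and may differ per deployment; the connection string is also just an example.

    // Hypothetical lookup of point tags by PointID from the configuration database
    // (assumes SQL Server and the standard GSF "Measurement" table -- adjust to your setup).
    using System;
    using System.Data.SqlClient;

    class MeasurementLookup
    {
        static void Main()
        {
            const string connectionString =
                "Data Source=localhost; Initial Catalog=openHistorian; Integrated Security=SSPI";

            using (var connection = new SqlConnection(connectionString))
            using (var command = new SqlCommand(
                "SELECT PointID, PointTag, SignalReference, Description FROM Measurement", connection))
            {
                connection.Open();

                using (SqlDataReader reader = command.ExecuteReader())
                {
                    while (reader.Read())
                        Console.WriteLine($"{reader["PointID"]}: {reader["PointTag"]} " +
                                          $"({reader["SignalReference"]}) - {reader["Description"]}");
                }
            }
        }
    }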

How to make a synchrophasor through TCP package

Hi,

I am trying to model a real PMU by using an FDR. However, the TCP packet from the PMU seems to carry a special protocol, i.e., synchrophasor, as shown below:
image
When I try to model them, the ones from the FDR show up as plain TCP packets:
image
Is there some special requirement for the PMU TCP packets?
By the way, these packets were captured with Wireshark.

Best regards,

He
FNET

Device list export fails

In the web manager, clicking 'Export CSV' on the device list will attempt to download 'CsvDownloadHandler.aspx', and then fail.

Machine IP:            fe80::3da8:ae20:b069:fdc7%16
Machine OS:            Microsoft Windows NT 6.2.9200.0

Application Domain:    openHistorian.exe
Assembly Codebase:     C:/Program Files/openHistorian/openHistorian.exe
Assembly Full Name:    openHistorian, Version=2.3.99.0, Culture=neutral, PublicKeyToken=null
Assembly Version:      2.3.99.0
Assembly Build Date:   12/7/2017 12:03:16 AM
.Net Runtime Version:  4.0.30319.42000

Exception Source:      GSF.Web
Exception Type:        System.Security.SecurityException
Exception Message:     Cannot download CSV data: access is denied for user "Undefined", minimum required roles = *.
Exception Target Site: CopyModelAsCsvToStream

---- Stack Trace ----
   GSF.Web.Model.Handlers.CsvDownloadHandler.CopyModelAsCsvToStream(securityPrincipal As SecurityPrincipal, requestParameters As NameValueCollection, responseStream As Stream, flushResponse As Action, cancellationToken As CompatibleCancellationToken)
       openHistorian.exe: N 02296
   GSF.Web.Model.Handlers.<>c__DisplayClass8_0.<ProcessRequestAsync>b__0(stream As Stream, content As HttpContent, context As TransportContext)
       openHistorian.exe: N 00154

Custom adapters and data archiving stopped

Hey guys,

I am using openHistorian version 2.3.7 and for some reason it stopped writing data to the archive, and my action adapters were stopped as well. I checked the logs and it seems the system somehow stopped processing the measurements.
image
After a while it resumed processing the measurements, but things were still abnormal.
image
Is this a known issue?

Fix build tools to update version in SDK project files

Version info can be set in the csproj file within the new SDK project file format. One way to do this easily is like this:
<PropertyGroup>
  <VersionPrefix>2.3.306</VersionPrefix>
</PropertyGroup>
(see, for example, .\openHistorian\Source\Tools\SampleFunctions\SampleFunctions.csproj)

Stackoverflow has this reference (https://stackoverflow.com/questions/56512069/how-can-you-share-assembly-info-between-vs-2019-formatted-projects#56517268)

Microsoft has this reference (https://docs.microsoft.com/en-us/visualstudio/msbuild/customize-your-build?view=vs-2019)

Build tools need to be updated to catch version info in SDK project files, or GPA needs to adopt a style guide stating that you are going to keep version info in assemblyinfo files.
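
As a hedged sketch of the shared-properties approach those references describe (the file name and property choices below are illustrative, not the project's current convention), a Directory.Build.props placed at the solution root would let MSBuild apply one version to every SDK-style project beneath it:

    <!-- Directory.Build.props (illustrative) placed next to the solution file;
         MSBuild imports it automatically for every SDK-style project below it. -->
    <Project>
      <PropertyGroup>
        <!-- Shared version prefix picked up by all SDK-style .csproj files. -->
        <VersionPrefix>2.3.306</VersionPrefix>
      </PropertyGroup>
    </Project>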

The data become "NAN"

I am trying to add a new PMU to openHistorian.
It can be connected through the PMU Connection Tester and can be connected to openHistorian. However, the data becomes "NAN".
When I try openPDC instead, the data is correct, which really confuses me.

Can anyone help me with this issue?

Best regards,

He Yin

openhistorian_nan data
openpdc

SET SERVER TIME ON WEB MANAGER

Hi,

How can I set the Historian web manager to local/server time? I noticed that while the time on the standard manager is aligned with local time, the web manager is two hours ahead.
How can I set this?

Regards,

Cosimo

How to back fill historical data to openHistorian?

Hi,

Last month we installed openHistorian as the database to archive the data received from FDRs. We have several years' historical data archived in hundreds of Access .mdb files. Now we plan to migrate the historical data from Access to openHistorian. So I have several questions and need some suggestions on how to do it.
The openHistorian we installed uses MySQL as its configuration database and is receiving real-time data from around 100 FDRs. We want to backfill historical data into this openHistorian.

  1. Is developing a custom input adapter the correct way to backfill data? Or is there any other better approach?
  2. Since the openHistorian is receiving real-time data, will backfilling historical data impact the real-time data receiving & archiving, or reduce performance?
  3. Is it doable to install a new openHistorian server on another computer using exactly the same MySQL configuration database, backfill data to this new openHistorian, and then copy all archive files to the openHistorian that is receiving real-time data once backfilling is done?

Any suggestion is appreciated!

Best regards,
Wenpeng Yu

Grafana API

May I know how I can use the Grafana API for openHistorian? I know you have published it, but I failed to find it on GitHub.

How to migrate data from one installation of openHistorian to another computer for trending/visualization

Hi,

Your response to the following question is greatly appreciated.

We would like to migrate the required data from one installation of openHistorian to a second computer and be able to trend/visualize the migrated PMU data using Grafana tool on the second computer.

Basically, we would like to know the required files and the step-by-step procedure to achieve the aforementioned data migration.

Thank you,

ARCHIVER FOLDER

Hi guys,

we set openHistorian to archive to the following folder:

image

Strangely, two folders are created: the first one with the current year and the second one related to the year 1970.

Do you have any idea of what happened?

Kind regards,

Cosimo

OPENPDC-OPENHISTORIAN INTERNAL SUBSCRIPTION

Hi,

Is the internal subscription in openHistorian (which permits transferring data internally between openPDC and openHistorian) a one-way communication from openPDC to openHistorian and not vice versa? Roughly speaking, is it openPDC that sends data to openHistorian, rather than openHistorian that asks openPDC for data?

Thank you,

Regards

Cosimo

Grafana Abnormality

Hi,
I find that the built-in Grafana fails to grab data from the openHistorian.
I am able to grab the real-time/historical frequency data from the .d2 files and I can see the data streaming in the historian manager, but I am not able to visualize the data in Grafana. Is there any known issue about this?

Issue with TCP package

Currently we are using FDRs to model PMU frames with the IEEE 2011 standard. The FDR will send multiple data/config frames merged within a single TCP packet.
However, it seems that openHistorian cannot separate the data/config frames successfully. It just treats them as bad data and reports an error.
Is there a recent, stable version that solves this problem?

Add max archive size setting

Currently, the only way to limit archive size is by the MaximumArchiveDays setting. Having an option to limit the archive to a certain size would simplify storage management considerably.

OPENHISTORIAN NAVIGATION

Good morning,

I'm new to openPDC and openHistorian, so I'm asking for your help while I read all the pages of documentation, issues, and discussions.
My questions are very simple and I list them below:

  1. Is openPDC independent from openHistorian? I noticed that the user interface is very similar; is data saving ensured even if openPDC is not running?

  2. I used "Historian Playback/Export Utility" while the export process is quite stable (except when you choose "Process data at XX samples per second") I'm would know it has an associated data viewer tool which permits to navigate the database ("Historian Playback" name seems intuitive that there is this possibility) as the analyst wants (for instance select all the trigger of overfrequency in the past or similar)

  3. Could you confirm that the "STAT measurements" of openPDC are not stored in openHistorian? Which quantities/variables are then stored by default, apart from the PMU signals? This is a crucial issue in selecting the most appropriate storage capacity.

Thanks for a kind reply

V

Trend/Export Measurements Timestamps

Hello everyone!
I am currently working on a project where I have connected a PMU device to a local openHistorian instance, and it seems I need your help on this. On the Trend/Export Measurements page of the openHistorian web UI, I am able to declare a start/stop time with mm/yy hr:min:secs.ms, but when I export the CSV file, the timestamp column is displayed in seconds and not in ms.
So I manually replaced the format with "d/m/yyyy hh:mm:ss.000".
Is there a way for the exported CSV to have this format initially, so I don't have to change it manually every time I need to extract data?
It might be something silly but I can't seem to find it anywhere.

ATMEGA2560 Modbus openHistorian display to Grafana

Dear,
I've made a Modbus program on an ATMEGA2560 with TCP. I send 7 values and I created a device named ATMEGA2560 with HR0, HR1 ... HR7 mappings.
If I connect the device, I can see all the numbers in the Current Value field.
modbusconfig

Now my problem is saving that data to the database and retrieving it from Grafana for display.

Could anyone help me find documentation to solve this problem?

Many thanks
Regards
