Elasticsearch

To use Hayei, you need an ELK (Elasticsearch, Logstash, Kibana) instance to centralize logs. To ship logs to the ELK cluster, Hayei uses Fluentd to capture logs at the source (application or system), pre-process them, and send them to ELK.

The ELK stack stores the logs that an application or web app generates; Hayei retrieves those logs for analysis.

For the ELK stack to do its job, it needs logs from an application or web app. To accomplish this, an agent acts as a middleman between the application and Elasticsearch. Fluentd, or any other agent of choice, should allow formatting logs and adding or removing fields to match the preferred format. The agent must be integrated with the application and given proper Elasticsearch credentials to allow log transfer.
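As a sketch of how such an agent might be set up, the Fluentd configuration below tails an application log file and forwards the records to Elasticsearch. All hosts, paths, credentials, and index names here are placeholder assumptions, not values Hayei requires; adjust them to your own cluster.

```
# Hypothetical Fluentd configuration (assumes fluent-plugin-elasticsearch
# is installed and the application writes JSON logs to a file).
<source>
  @type tail
  path /var/log/myapp/app.log          # application log file (placeholder)
  pos_file /var/log/fluentd/myapp.pos  # tracks the read position
  tag myapp.logs
  <parse>
    @type json                         # assumes one JSON object per line
  </parse>
</source>

<match myapp.**>
  @type elasticsearch                  # ship matched records to Elasticsearch
  host elk.example.com                 # placeholder host
  port 9200
  scheme https
  user fluentd                         # Elasticsearch credentials for the agent
  password changeme
  index_name myapp-logs                # index Hayei would later read from
</match>
```

A `filter` section could be added between the source and the match to add or remove fields, which is where the formatting mentioned above would happen.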

The agent plays an essential role in how logs arrive in Elasticsearch for later use in Hayei: well-formatted logs are easier to analyze.
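One common way to make logs easy for an agent and for Elasticsearch to handle is to emit them as structured JSON in the first place. The snippet below is a minimal, illustrative sketch in Python using only the standard library; the field names (`timestamp`, `level`, `logger`, `message`) are an assumed format, not one mandated by Hayei.

```python
import json
import logging

class JsonFormatter(logging.Formatter):
    """Render each log record as a single JSON object per line."""

    def format(self, record):
        return json.dumps({
            "timestamp": self.formatTime(record),  # human-readable time
            "level": record.levelname,             # e.g. "INFO", "ERROR"
            "logger": record.name,                 # logger/application name
            "message": record.getMessage(),        # the formatted message
        })

# Wire the formatter into a logger that writes one JSON line per event.
handler = logging.StreamHandler()
handler.setFormatter(JsonFormatter())
logger = logging.getLogger("myapp")
logger.addHandler(handler)
logger.setLevel(logging.INFO)

logger.info("user login succeeded")
```

Each line this logger produces can be parsed directly by Fluentd's JSON parser, so fields arrive in Elasticsearch already separated instead of buried in free text.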

Login to Hayei

To use Hayei, the user first needs to create an administrator account. Once the account has been created and validated, the fun part of Hayei begins.

Note: Hayei supports four distinct roles, each with different permissions: Administrator (can do everything), Developer (can change some settings), Support (can read and update), and Guest (read-only).

First Steps

After logging into Hayei for the first time, the user sees a page for adding the first data source.

On this page, the user inputs all the information needed to add a new data source.

Creating a data source means connecting an ELK cluster to Hayei. The user must provide a display name and can optionally add an icon to represent the data source. The user must also fill out the form with the schema, host, port, and ELK credentials (user and password). Finally, the user sets up the first environment by giving it a name and specifying the ELK index.
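The form fields above (schema, host, port, user, password) are exactly what is needed to reach an Elasticsearch cluster over HTTP. As an illustration only, the helper below shows how those values combine into a base URL and a Basic-Auth header; the function name and all example values are hypothetical, not part of Hayei's actual API.

```python
import base64

def elk_connection(schema, host, port, user, password):
    """Build the base URL and Basic-Auth header for an ELK data source.

    Mirrors the fields in the data-source form; illustrative only.
    """
    base_url = f"{schema}://{host}:{port}"
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    return base_url, {"Authorization": f"Basic {token}"}

# Example values a user might type into the form:
url, headers = elk_connection("https", "elk.example.com", 9200, "hayei", "s3cret")
# An HTTP GET to `url` with these headers should return the cluster
# banner if the host is reachable and the credentials are valid.
```

Checking these values against the cluster before saving the data source (for example with a simple GET request) is a quick way to catch typos in the host, port, or credentials.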

With the information provided by the user, Hayei can start fetching logs and creating tickets based on the logs received.

Next