
Data Analysis

Djuno Data Analysis offers a cloud-based platform for easy integration, visualization, and analysis of real-time data with tools like Grafana and OpenSearch.

Written by Djuno Support
Updated over a month ago

Overview

The Djuno Data Analysis platform offers a seamless cloud-based solution for users to integrate, visualize, and analyze real-time data with ease. This setup is part of a comprehensive cloud management platform where users can access various data analysis tools like Grafana and OpenSearch. The "Create a Service" button allows users to quickly deploy, configure, and scale their data analysis services to meet their specific needs.

How to create a data analysis service

When creating a new data analysis service in Djuno Cloud, users are guided through several configuration steps to customize the service to their needs:

1. Choose the analysis tool, such as Grafana or OpenSearch, based on specific requirements.
2. Pick a service plan: Business, Advanced, or Essential.
3. Choose a hosting region, such as Beauharnois, Frankfurt, or London, for optimal performance and compliance.
4. Configure system resources (CPU, memory, and storage), with estimated costs shown for each choice.
5. Adjust the number of nodes for better scalability and availability.

Once all selections are made, click Order to deploy the service. This process offers the flexibility to meet both technical needs and budget preferences.

After creating a data analysis service, users will find additional options in the cluster overview, such as General Information, Edit, and Delete functionalities.

The Cluster Overview provides key details about the data analysis instance, including the cluster name, unique ID, and the current status, which shows whether the service is active or being created. The service operates on a specified version, such as OpenSearch 2, and follows the chosen service plan (e.g., Essential). Users can view and adjust configurations for CPU, RAM, and storage allocation, with options for upgrading these resources based on their requirements. The instance is hosted in a specific datacenter, such as BHS, and uses first-generation remote storage.

Login information provides the URI and host details for connecting to the data analysis service, along with SSL requirements for secure communication. Users can manage authorized IP addresses to control access to the service.
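The service URI shown under Login information bundles the credentials, host, and port into a single string. As a minimal sketch, the Python standard library can split it into its parts; the URI below is hypothetical, and the real value should be copied from your cluster's Login information panel:

```python
from urllib.parse import urlparse

# Hypothetical service URI; copy the real value from "Login information".
uri = "https://admin:s3cret@my-cluster.example.com:9200"

parts = urlparse(uri)
print(parts.scheme)    # "https" implies SSL is required for the connection
print(parts.hostname)  # host to connect to
print(parts.port)      # port to connect on
```

These parsed fields are what a client library or dashboard connection form typically asks for individually.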

In the configuration section, users can set maintenance schedules, manage network settings, and configure authorized IP addresses for security. The service is hosted on a public network, and backup and recovery options are available to ensure data protection.

Users

This tab allows administrators to manage data analysis users by adding new users and viewing details such as usernames, creation date, and account status. Each user’s entry includes their status (e.g., READY), providing an overview of their current access. The dropdown menu for each user offers options to delete the user, reset their password, view the certificate, or view the access key, giving administrators quick controls for managing user access and ensuring the security of the data analysis service.

Create user:

ACL

The ACL (Access Control List) tab in the data analysis section allows administrators to manage and control user access permissions to specific resources. By enabling ACLs, administrators can define which users have access to particular index models and the level of permission (such as read or write) they possess. To create a new ACL, administrators can add a username, assign the relevant permission, and select the corresponding index model. This ensures that only authorized users have the appropriate level of access to the resources, enhancing the security and control of the data analysis environment.

Create ACL:

In the "Add an ACL" section, administrators can define specific access control rules for users within the data analysis platform. To create an ACL, the administrator selects a user and assigns them a permission level (such as read or write). Index models use a standard matching system, also known as glob patterns, where '*' matches any number of characters and '?' matches exactly one character. For example, entering "logs_200*" will match any index that begins with "logs_200", while "logs_200?" would match indices like "logs_2009" but not "logs_200904". After configuring the user, permission, and index model pattern, the administrator can click "Add an ACL" to apply the access control rule, ensuring that only authorized users can interact with the designated data models.
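The glob semantics described above can be illustrated with Python's `fnmatch` module, which implements the same '*' and '?' rules; the index names here are made up for the example:

```python
from fnmatch import fnmatch

# '*' matches any number of characters, '?' matches exactly one character,
# mirroring the glob semantics used by ACL index model patterns.
indices = ["logs_200904", "logs_201904", "logs_2009", "metrics_2024"]

# "logs_200*" matches every index that begins with "logs_200"
print([i for i in indices if fnmatch(i, "logs_200*")])  # ['logs_200904', 'logs_2009']

# "logs_200?" requires exactly one character after "logs_200"
print([i for i in indices if fnmatch(i, "logs_200?")])  # ['logs_2009']
```

Testing a candidate pattern this way before saving the ACL helps confirm it grants access to exactly the intended indexes.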

Indexes

In the "Indexes" tab, administrators can manage the various indexes within the data analysis platform. The tab is divided into two sections: "Models" and "Indexes."

In the "Models" section, you can add a new template, which allows you to define configurations for managing your index models. This includes setting a template for the maximum number of indexes that can be created.

The "Indexes" section lists all the indexes currently present in the system. For each index, details such as the index name, number of shards, replicas, size, document count, and creation date are displayed. For example, indexes like .opensearch-observability and .kibana_1 show information about their structure, storage usage, and when they were created. This overview helps administrators monitor and manage the storage and performance of their data efficiently.

Create template

In the "Add Template" section, administrators can define a model to manage indexes, automatically deleting older ones once a specified threshold is exceeded. The model uses glob patterns, such as "logs*" to match indexes like "logs_foo" or "logs_2025." By setting the maximum number of indexes, administrators can control how many logs are kept before older ones are deleted. This system helps optimize storage and ensures only the necessary data is retained. After configuring the template, click "Create" to save it.
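One way to picture the retention rule a template enforces: collect the indexes matching the glob pattern, order them from oldest to newest, and flag everything beyond the allowed maximum for deletion. The sketch below models that rule with made-up index names and dates; it is an illustration of the behavior described above, not the platform's internal implementation:

```python
from fnmatch import fnmatch

def indexes_to_delete(indexes, pattern, max_indexes):
    """Return the oldest pattern-matching indexes beyond the allowed maximum.

    `indexes` is a list of (name, created_at) tuples. This mirrors the
    template retention rule described above, not the platform's own logic.
    """
    matching = sorted(
        (ix for ix in indexes if fnmatch(ix[0], pattern)),
        key=lambda ix: ix[1],  # oldest first
    )
    excess = len(matching) - max_indexes
    return [name for name, _ in matching[:excess]] if excess > 0 else []

indexes = [
    ("logs_2023", "2023-01-01"),
    ("logs_2024", "2024-01-01"),
    ("logs_2025", "2025-01-01"),
    ("metrics_2025", "2025-01-01"),  # does not match "logs*"
]
print(indexes_to_delete(indexes, "logs*", 2))  # ['logs_2023']
```

With a maximum of 2, only the oldest matching index is flagged; non-matching indexes like "metrics_2025" are never touched by a "logs*" template.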

Backups

In the Backup tab, you can manage and view all your data analysis backups. This section provides details about each backup, including the name, location, creation date, expiry date, and status. You can restore or fork any backup as needed, ensuring your service remains operational. The backups are maintained for data security and disaster recovery.

Duplicate (Fork):

When you click the Duplicate (Fork) option in the backup table dropdown or button, a page will appear to help you duplicate your data analysis service. This process will create a new cluster from your backup fork. You can choose a restore point, selecting either the most recent backup or a specific date for the duplication.


Authorized IPs

In the Authorized IP tab, you can manage the IP addresses allowed to access your data analysis service. The table displays each IP address/mask along with a description. You can use the Edit option to modify the IP address or the Delete option to remove it from the list.
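An entry in this table is an IP address with a network mask (CIDR notation), and a client is allowed when its address falls inside one of the authorized blocks. A minimal sketch of that check using Python's `ipaddress` module, with addresses made up for illustration:

```python
import ipaddress

# Hypothetical authorized entries as they might appear in the table.
authorized = ["203.0.113.0/24", "198.51.100.7/32"]

def is_authorized(client_ip, blocks):
    """Return True if client_ip falls inside any authorized IP block."""
    ip = ipaddress.ip_address(client_ip)
    return any(ip in ipaddress.ip_network(block) for block in blocks)

print(is_authorized("203.0.113.42", authorized))  # True  (inside the /24 block)
print(is_authorized("192.0.2.1", authorized))     # False (no matching block)
```

A /32 mask authorizes a single address, while a wider mask such as /24 authorizes the whole block; choosing the narrowest mask that covers your clients keeps the access list tight.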

Create IP:

Click Add an IP or IP block (CIDR) and enter the address to create a new entry.

Edit IP:

Click the Edit IP Address option in the dropdown to modify an existing IP address.

Logs

In the Logs tab, you can monitor and manage your data analysis service by viewing the most recent events (logs) in near real-time. The retention period for these logs depends on your selected service plan.

Metrics

To help you track and manage your data analysis instance, view its key metrics and statistics in this tab. The retention period for these metrics depends on your selected service plan.

Service integration

Service Integration refers to the process of connecting different data analysis services or applications to work together seamlessly, enabling efficient data sharing and coordinated actions across platforms. In the Service Integration tab, you can manage connections between various data analysis tools. The "Add an integration" option allows you to connect different services smoothly, while you also have the ability to delete existing integrations as needed.

Create service integration:

In the "Add Service Integration" modal, you can initiate the process of adding a new integration by selecting the integration type. You will need to choose a source service and a target service from the available options. Additionally, there are fields for specifying the index prefix and the maximum number of days for the index, starting at zero.

Advanced configuration

In the "Advanced Configuration" tab for data analysis, you can modify various settings to tailor your service to specific needs. This section allows you to adjust key options, such as logging configurations and data handling preferences. You can choose to enable or disable specific features to optimize the service’s performance. After making the necessary adjustments to the settings, clicking the "Update advanced configuration" button will save and apply the changes, ensuring the service operates according to your specifications.
