Skyline.DataMiner.Utils.QAPortalAPI 1.0.13

dotnet add package Skyline.DataMiner.Utils.QAPortalAPI --version 1.0.13                
NuGet\Install-Package Skyline.DataMiner.Utils.QAPortalAPI -Version 1.0.13                
This command is intended to be used within the Package Manager Console in Visual Studio, as it uses the NuGet module's version of Install-Package.
<PackageReference Include="Skyline.DataMiner.Utils.QAPortalAPI" Version="1.0.13" />                
For projects that support PackageReference, copy this XML node into the project file to reference the package.
paket add Skyline.DataMiner.Utils.QAPortalAPI --version 1.0.13                
#r "nuget: Skyline.DataMiner.Utils.QAPortalAPI, 1.0.13"                
#r directive can be used in F# Interactive and Polyglot Notebooks. Copy this into the interactive tool or source code of the script to reference the package.
// Install Skyline.DataMiner.Utils.QAPortalAPI as a Cake Addin
#addin nuget:?package=Skyline.DataMiner.Utils.QAPortalAPI&version=1.0.13

// Install Skyline.DataMiner.Utils.QAPortalAPI as a Cake Tool
#tool nuget:?package=Skyline.DataMiner.Utils.QAPortalAPI&version=1.0.13                

Skyline.DataMiner.Utils.QAPortalAPI

DataMiner CICD NuGet Solution

About

Skyline.DataMiner.Utils.QAPortalAPI

Skyline.DataMiner.Utils.QAPortalAPI was created to make it easy to push test results to the QAPortal in a standardized way. This portal contains all test information and results.

Skyline.DataMiner.Utils.QAPortalTool

This is a .NET tool that can be executed from the command line.
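For example, assuming the tool is published under the package ID shown above (an assumption, not confirmed on this page), it could be installed as a global .NET tool:

dotnet tool install -g Skyline.DataMiner.Utils.QAPortalTool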

About DataMiner

DataMiner is a transformational platform that provides vendor-independent control and monitoring of devices and services. Out of the box and by design, it addresses key challenges such as security, complexity, multi-cloud, and much more. It has a pronounced open architecture and powerful capabilities enabling users to evolve easily and continuously.

The foundation of DataMiner is its powerful and versatile data acquisition and control layer. With DataMiner, there are no restrictions to what data users can access. Data sources may reside on premises, in the cloud, or in a hybrid setup.

A unique catalog of 7000+ connectors already exists. In addition, you can leverage DataMiner Development Packages to build your own connectors (also known as "protocols" or "drivers").

Note See also: About DataMiner

About Skyline Communications

At Skyline Communications, we deal in world-class solutions that are deployed by leading companies around the globe. Check out our proven track record and see how we make our customers' lives easier by empowering them to take their operations to the next level.

Getting Started

Include the NuGet package in your project, and deploy the Skyline.DataMiner.Utils.QAPortalAPI.dll on the DataMiner system (in the future, the DataMiner deployment action will be automated).

Add the following using statements to your code:

using System.Collections.Generic;
using QAPortalAPI.APIHelper;
using QAPortalAPI.Enums;
using QAPortalAPI.Models.ReportingModels;

The first thing that needs to happen in the script/test is to initialize the TestReport object. This object will contain all information about the test and the system, and it is what will be sent to the portal.

The DataMiner version is automatically retrieved when you use this structure:

TestReport testReport = new TestReport(
    new TestInfo("AUniqueTestName", "ASkylineDomainOrEmail", new List<int> {99999}, "The description of a Test"),
    new TestSystemInfo("[AgentNameOnThePortal]"));
  • The TestInfo object contains all information about the test; this information will be used on the portal.
  • The portal will automatically add the customer abbreviation to the test name, to make sure individual tests are unique across all customers.
  • The TestSystemInfo object contains information about the system the test was executed on. This is used to link the results to the correct system and DataMiner version.
  • Make sure to provide the name of the Agent the test is running on, not the cluster name.

Next, we want to instantiate an object from the QAPortalAPIHelper. You can achieve this with the following code:

// The URL or email address of the portal, provided when the system is added to the QAPortal.
string portalUrl = "publicPortalUrl";
string portalEmail = "publicPortalEmail";

QAPortalAPIHelper reportHelperViaAPI = new QAPortalAPIHelper(engine.GenerateInformation, portalUrl, "SafelyStoredClientID", "SafelyStoredAPIKey");

QAPortalAPIHelper reportHelperViaAPIAndProxy = new QAPortalAPIHelper(engine.GenerateInformation, portalUrl, "SafelyStoredClientID", "SafelyStoredAPIKey", "[AProxy]");

QAPortalAPIHelper reportHelperViaEmail = new QAPortalAPIHelper(engine.GenerateInformation, portalEmail, "SafelyStoredClientID", "SafelyStoredAPIKey", engine.SendEmail);

Now the actual script/test can start running. Each time a test case is completed, its result has to be added to the report object.

// Add a successful test case to the report
testReport.TryAddTestCase(new TestCaseReport("AUniqueTestCaseName", Result.Success, string.Empty), out _);

// Add a failed test case to the report
testReport.TryAddTestCase(new TestCaseReport("AUniqueTestCaseName2", Result.Failure, "The Fail message"), out _);

Once all test cases have been executed and saved in the report, we let the QAPortal know that the regression run has stopped, whether the run has failed, and add a comment:

// Regression run hasn't failed, as indicated by the false parameter
reportHelperViaAPI.StopRegressionRun("AClusterName", false, "RegressionRun hasn't failed");

// Regression run has failed; notice the true parameter
reportHelperViaAPI.StopRegressionRun("AClusterName", true, "RegressionRun has failed");

Next, we have to send the result to the QAPortal via the PostResult method.

This can be done in two ways:

  • Directly to the API (internet access needed)
  • By email (an action has to be provided that can send an email)
// At the end of the test, post the result directly to the API using PostResult
reportHelperViaAPI.PostResult(testReport);

// At the end of the test, post the result using PostResult (when using a proxy)
reportHelperViaAPIAndProxy.PostResult(testReport);

// At the end of the test, post the result using PostResult (when using email)
reportHelperViaEmail.PostResult(testReport);

See also the full DataMiner script examples: via API or via email.
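For reference, the snippets above can be combined into one minimal script. The sketch below assumes a standard Automation script entry point (Skyline.DataMiner.Automation) and reuses the placeholder names and credentials from the examples above; adapt them to your own test and system.

// Minimal end-to-end sketch; the Automation entry point and placeholder values are assumptions.
using System.Collections.Generic;
using QAPortalAPI.APIHelper;
using QAPortalAPI.Enums;
using QAPortalAPI.Models.ReportingModels;
using Skyline.DataMiner.Automation;

public class Script
{
    public void Run(IEngine engine)
    {
        // Describe the test and the system it runs on.
        TestReport testReport = new TestReport(
            new TestInfo("AUniqueTestName", "ASkylineDomainOrEmail", new List<int> { 99999 }, "The description of a Test"),
            new TestSystemInfo("[AgentNameOnThePortal]"));

        // Run the test cases and record their results.
        testReport.TryAddTestCase(new TestCaseReport("AUniqueTestCaseName", Result.Success, string.Empty), out _);
        testReport.TryAddTestCase(new TestCaseReport("AUniqueTestCaseName2", Result.Failure, "The Fail message"), out _);

        // Post the report directly to the API (internet access needed).
        QAPortalAPIHelper reportHelperViaAPI = new QAPortalAPIHelper(engine.GenerateInformation, "publicPortalUrl", "SafelyStoredClientID", "SafelyStoredAPIKey");
        reportHelperViaAPI.PostResult(testReport);
    }
}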

Warning When using the Skyline.DataMiner.Utils.QAPortalAPI on DataMiner versions lower than 10.1.11 or 10.2.0.0, add the netstandard.dll reference to your script.

Note The portalUrl, ClientID, and API key will be provided by your contact at Skyline Communications.
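As one purely illustrative way to keep these values out of the script source (the environment variable names below are assumptions, not part of the API), they could be read from environment variables on the Agent:

// Illustrative only: read the credentials from environment variables instead of hard-coding them.
string clientId = System.Environment.GetEnvironmentVariable("QAPORTAL_CLIENT_ID");
string apiKey = System.Environment.GetEnvironmentVariable("QAPORTAL_API_KEY");
QAPortalAPIHelper reportHelper = new QAPortalAPIHelper(engine.GenerateInformation, portalUrl, clientId, apiKey);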

Performance Results

Via the API, it is also possible to publish performance results to the portal. This can be done by adding the performance results to your test report:

//Performance results
testReport.PerformanceTestCases.Add(new PerformanceTestCaseReport("APerfName1", Result.Success, "SomeInfo About the test case and the value", ResultUnit.Second, 99));

testReport.PerformanceTestCases.Add(new PerformanceTestCaseReport("AUniqueTestCaseName1", Result.Failure, "SomeInfo About the test case and the value", ResultUnit.Second, 9));

You can either have dedicated performance test cases or functional test cases that also push a performance result.
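As an illustration of the latter, a functional result and a performance figure can be reported under the same test case name, so they end up linked on the portal; the name and values below are placeholders.

// Illustrative sketch: a functional result plus a performance figure for the same test case (placeholder name and values).
testReport.TryAddTestCase(new TestCaseReport("VerifyTableIsFilled", Result.Success, string.Empty), out _);
testReport.PerformanceTestCases.Add(new PerformanceTestCaseReport("VerifyTableIsFilled", Result.Success, "Time needed to fill the table", ResultUnit.Second, 12));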

Start and stop Regression Runs

You can report the start or stop of a regression run for a test system to the QAPortal. With a regression run, the results are grouped together, and for each run we save statistics (total tests/failures/...). An email with the results is also sent out when the regression run stops (currently only internally).

This can be done via the below methods:

Start Regression Run

Starts a regression run for a test system. If there is already an active run, it will stop that run and create a new one.

Accepts the cluster name as a string parameter.

//Via API
QAPortalAPIHelper reportHelperViaAPI = new QAPortalAPIHelper(engine.GenerateInformation, QAPORTALURL, ClientID, APIKey);
reportHelperViaAPI.StartRegressionRun("AClusterName");

//Via email
QAPortalAPIHelper reportHelperViaEMail = new QAPortalAPIHelper(engine.GenerateInformation, QAPORTALURL, ClientID, APIKey, plainBodyMail);
reportHelperViaEMail.StartRegressionRun("AClusterName");

Stop Regression Run

Stops an active regression run for a system.

You have to provide the following parameters:

  • The cluster name as a string.
  • Whether the regression run failed, as a boolean: true means the regression has failed, false means it was successful.
  • A comment as a string.
QAPortalAPIHelper reportHelperViaAPI = new QAPortalAPIHelper(engine.GenerateInformation, QAPORTALURL, ClientID, APIKey);

// Regression run has failed
reportHelperViaAPI.StopRegressionRun("AClusterName", true, "RegressionRun has failed");

// Or: regression run was successful
reportHelperViaAPI.StopRegressionRun("AClusterName", false, "RegressionRun was a success");


QAPortalAPIHelper reportHelperViaEMail = new QAPortalAPIHelper(engine.GenerateInformation, QAPORTALURL, ClientID, APIKey, plainBodyMail);
// Regression run has failed
reportHelperViaEMail.StopRegressionRun("AClusterName", true, "RegressionRun has failed");

// Or: regression run was successful
reportHelperViaEMail.StopRegressionRun("AClusterName", false, "RegressionRun was a success");

See also examples:

Note

  • When a start-regression request is sent while the portal still has an active regression run, the portal will stop the run that was active and start a new one.
  • When a stop-regression request comes in while the portal doesn't have an active regression run, the portal will ignore the request.
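Putting these calls together with the Getting Started flow above, a regression run could be bracketed around a test as sketched below; the ordering mirrors the earlier examples and the placeholder names are reused.

// Sketch of a regression flow combining the calls above, mirroring the Getting Started order.
reportHelperViaAPI.StartRegressionRun("AClusterName");

// ... execute the test cases and add their results to the test report ...

reportHelperViaAPI.StopRegressionRun("AClusterName", false, "RegressionRun was a success");
reportHelperViaAPI.PostResult(testReport);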

Debugging and Diagnostics

No test-case results error

Issue: When posting to the portal, you get the error message: No test-case results are present on the test report.
Fix: The report always expects at least one test case result (success or failure).
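In other words, make sure at least one test case result is added before calling PostResult. A minimal sketch using the objects from the earlier examples (the test case name is only a placeholder):

// Record at least one test case result before posting, so the report is never empty (placeholder name).
testReport.TryAddTestCase(new TestCaseReport("ScriptCompleted", Result.Success, string.Empty), out _);
reportHelperViaAPI.PostResult(testReport);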

Compatible and additional computed target framework versions:

  • .NET Framework: net462, net48 (compatible); net461, net463, net47, net471, net472, net481 (computed)
  • .NET Standard: netstandard2.0 (compatible); netstandard2.1 (computed)
  • .NET: net5.0 through net8.0, including their platform-specific variants (computed)
  • .NET Core: netcoreapp2.0 through netcoreapp3.1 (computed)
  • MonoAndroid, MonoMac, MonoTouch, Tizen, Xamarin.iOS, Xamarin.Mac, Xamarin.TVOS, Xamarin.WatchOS (computed)

NuGet packages

This package is not used by any NuGet packages.

GitHub repositories

This package is not used by any popular GitHub repositories.

Version Downloads Last updated
1.0.14 2,061 4/26/2024
1.0.13 4,184 10/17/2023
1.0.12 296 10/4/2023
1.0.11 160 9/28/2023
1.0.10 779 8/21/2023
1.0.10-AlphaTest 125 8/9/2023
1.0.9 1,143 4/6/2023
1.0.8 428 3/8/2023
1.0.7 421 2/17/2023
1.0.5 306 2/7/2023
1.0.1 257 2/7/2023
1.0.0 324 2/1/2023