Month: September 2016

Opvizor Performance Analyzer Part 9: Performance in Real Time for NetApp

There is a best practices white paper available for NetApp and VMware vSphere storage. You can find the complete paper here. In the NetApp Community there are many interesting discussions, e.g. "Beginning this month, Tech OnTap highlights popular discussion threads appearing in the NetApp Technical Community. In this month’s discussion, Christopher Madden, NetApp IT …


Opvizor Performance Analyzer Part 8: Performance in Real Time for MongoDB

"Tens of thousands of organizations use MongoDB to build high-performance systems at scale, including over 30 of the world’s 100 largest organizations and many of the world’s most successful and innovative web companies. To achieve high performance with any database system requires an understanding of best practices. Get the guide that will inform you of …


Opvizor Performance Analyzer Part 7: Performance in Real Time for Debian Linux

"Linux tuning information is scattered among many hundreds of sites, each with a little bit of knowledge. Virtual machine tuning information is equally scattered about. This is my attempt at indexing all of it," said Bob Plankers in his Blog. You can find his complete guide here. "In a default distro install, system logging is …


Opvizor's Performance Analyzer 3 Is Certified DataCore Ready

Opvizor Inc. today announced that its Performance Analyzer 3 is certified DataCore Ready for SANsymphony™, the Software-Defined Storage Platform from DataCore™ Software, helping organizations maximize the availability and utilization of IT assets and simplify data storage management. “It is a great pleasure to announce the DataCore Ready Solution certification for our …


Opvizor Performance Analyzer Part 6: Performance in Real Time for Docker Container

Here are best practices for Docker containers: "The documentation has been written specifically for the developers who are writing Dockerfiles to create images. Dockerfiles can become rather complex depending on the application being containerized, so we are passing on our experience with Dockerfiles through a series of best practices. This guide of best practices is …


Opvizor Performance Analyzer Part 5: Performance in Real Time for Ubuntu Linux

For things to do after installing Ubuntu 16.04 LTS (Xenial Xerus), please read the complete article here. Here are 9 killer tips to speed up Ubuntu 14.04; the article was originally written for Ubuntu 13.10 but is equally applicable to Ubuntu 14.04 and 15.04. If you already installed Ubuntu 16.04.x LTS Xenial Xerus, here is …


Your One-Size-Fits-All VMware vSphere Dashboard for Performance and Monitoring

If you’re looking for a modern way to check and monitor performance, you should give Performance Analyzer a try. Monitor and analyze VMware vSphere configuration and performance metrics with our VMware vSphere dashboard, the visual part of Performance Analyzer for VMware vSphere. Correlate events and metrics in guest applications …


Opvizor Performance Analyzer Part 4: Performance in Real Time for Microsoft Hyper-V

"In Windows management, best practices are guidelines that are considered the ideal way, under normal circumstances, to configure a server as defined by experts. Best practice violations, even critical ones, might not necessarily cause problems. But, they can indicate server configurations that can result in poor performance, poor reliability, unexpected conflicts, increased security risks, or other potential …


Opvizor Performance Analyzer Part 1: Performance in Real Time for IBM AIX

Based on the IBM blog about using the nmon command as a performance measurement and monitoring tool, this link gives you detailed information: https://www.ibm.com/developerworks/aix/library/au-analyze_aix/ The blog entry covers: the nmon tool is designed for AIX and Linux performance specialists to use for monitoring and analyzing performance data, including CPU utilization, memory use, kernel statistics, …


Opvizor Performance Analyzer Part 3: Performance in Real Time for DataCore SANsymphony

You can find a best practices guide here and the complete technical marketing documentation here. The introduction from the DataCore Software best practices: "DataCore has supported sending SCSI commands over an IP Network since 2001, first with our own STP driver until iSCSI became ratified and an iSCSI driver for Windows was available on Windows 2000 …


Use Case - Tamper-resistant Clinical Trials

Goal:

Blockchain PoCs were unsuccessful due to complexity and lack of developers.

Still, the goals of data immutability and client-side verification are crucial. Furthermore, the system needs to be easy to use and operate (allowing backups, maintenance windows, and so on).

Implementation:

immudb runs in different datacenters across the globe. All clinical trial information is stored in immudb, either as individual transactions or as whole PDF documents.

Having that single source of truth with versioned, timestamped, and cryptographically verifiable records enables a whole new level of transparency and trust.
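
As an illustration of that pattern, here is a minimal sketch using the immudb Go SDK (github.com/codenotary/immudb/pkg/client): it stores a trial document under a key and reads it back with an inclusion proof that the SDK verifies against its locally kept state. The key layout, file name, credentials, and address are assumptions made for this example, not part of the deployment described above.

    package main

    import (
        "context"
        "log"
        "os"

        immudb "github.com/codenotary/immudb/pkg/client"
    )

    func main() {
        ctx := context.Background()

        // Connect to one of the immudb instances; address, credentials, and
        // database name are placeholders.
        opts := immudb.DefaultOptions().WithAddress("127.0.0.1").WithPort(3322)
        client := immudb.NewClient().WithOptions(opts)
        if err := client.OpenSession(ctx, []byte("immudb"), []byte("immudb"), "defaultdb"); err != nil {
            log.Fatal(err)
        }
        defer client.CloseSession(ctx)

        // Store a trial document as a whole; the key naming is a hypothetical
        // convention for this sketch.
        pdf, err := os.ReadFile("consent.pdf")
        if err != nil {
            log.Fatal(err)
        }
        hdr, err := client.VerifiedSet(ctx, []byte("trial-123/consent.pdf"), pdf)
        if err != nil {
            log.Fatal(err)
        }
        log.Printf("document stored and proof verified in tx %d", hdr.Id)

        // Any client can later read the document back together with a
        // cryptographic inclusion proof.
        entry, err := client.VerifiedGet(ctx, []byte("trial-123/consent.pdf"))
        if err != nil {
            log.Fatal(err)
        }
        log.Printf("verified %d bytes stored at tx %d", len(entry.Value), entry.Tx)
    }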

Use Case - Finance

Goal:

Store the source data, the decision, and the rule base for government financial support in a timestamped, verifiable way.

A very important piece of functionality is the ability to compare a historic decision (based on the rulebase of that time) with the rulebase at a different date. Fully cryptographically verifiable time-travel queries are required to achieve that comparison.

Implementation:

While the source data, the rulebase, and the documented decision are stored as verifiable blobs in immudb, the transaction itself is stored using immudb's relational layer.

That allows the use of immudb's time travel capabilities to retrieve verified historic data and recalculate it against the most recent rulebase.
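
A sketch of how that could look with immudb's SQL support through the Go SDK. The decisions table, its columns, and the transaction id used in the temporal query are hypothetical, and the BEFORE TX clause follows immudb's documented temporal SQL syntax; treat it as an outline rather than a ready-made schema.

    package main

    import (
        "context"
        "log"

        immudb "github.com/codenotary/immudb/pkg/client"
    )

    func main() {
        ctx := context.Background()

        opts := immudb.DefaultOptions().WithAddress("127.0.0.1").WithPort(3322)
        client := immudb.NewClient().WithOptions(opts)
        if err := client.OpenSession(ctx, []byte("immudb"), []byte("immudb"), "defaultdb"); err != nil {
            log.Fatal(err)
        }
        defer client.CloseSession(ctx)

        // Hypothetical schema: each row links a decision to the blob keys of
        // its source data and the rulebase version it was based on.
        _, err := client.SQLExec(ctx, `
            CREATE TABLE IF NOT EXISTS decisions (
                id          INTEGER AUTO_INCREMENT,
                case_ref    VARCHAR[64],
                rulebase_id VARCHAR[64],
                source_blob VARCHAR[128],
                approved    BOOLEAN,
                PRIMARY KEY id
            )`, nil)
        if err != nil {
            log.Fatal(err)
        }

        // Record a decision as a normal SQL transaction.
        _, err = client.SQLExec(ctx,
            "INSERT INTO decisions (case_ref, rulebase_id, source_blob, approved) VALUES (@c, @r, @s, @a)",
            map[string]interface{}{"c": "case-001", "r": "rulebase-v7", "s": "blob/abc123", "a": true})
        if err != nil {
            log.Fatal(err)
        }

        // Time travel: read the decisions as they existed up to a given
        // transaction (placeholder id 1000), so a historic decision can be
        // compared with the rulebase valid at another point in time.
        res, err := client.SQLQuery(ctx,
            "SELECT case_ref, rulebase_id, approved FROM decisions BEFORE TX 1000", nil, true)
        if err != nil {
            log.Fatal(err)
        }
        for _, row := range res.Rows {
            log.Println(row)
        }
    }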

Use Case - eCommerce and NFT marketplace

Goal:

No matter if it’s an eCommerce platform or an NFT marketplace, the goals are similar:

  • High transaction volume (potentially millions of transactions per second)
  • Ability to read and write multiple records within one transaction
  • Prevent overwrites of or updates to transactions
  • Comply with regulations (PCI, GDPR, …)


Implementation:

immudb is typically scaled out on hyperscalers (e.g. AWS, Google Cloud, Microsoft Azure) and distributed across the globe. Auditors are also distributed to track the verification proofs over time. Additionally, the shop or marketplace applications store immudb's cryptographic state information. This high level of integrity and tamper evidence, combined with very high transaction speed, is key for companies choosing immudb.
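
As a rough sketch of the multi-record requirement, the following Go example writes an order header and its line items atomically in one immudb transaction via the SDK's SetAll call, then reads the database's current cryptographic state so the application can persist it on its side. Key names and payloads are invented for the example.

    package main

    import (
        "context"
        "log"

        "github.com/codenotary/immudb/pkg/api/schema"
        immudb "github.com/codenotary/immudb/pkg/client"
    )

    func main() {
        ctx := context.Background()

        opts := immudb.DefaultOptions().WithAddress("127.0.0.1").WithPort(3322)
        client := immudb.NewClient().WithOptions(opts)
        if err := client.OpenSession(ctx, []byte("immudb"), []byte("immudb"), "defaultdb"); err != nil {
            log.Fatal(err)
        }
        defer client.CloseSession(ctx)

        // Write several related records in a single transaction: the order
        // header plus its line items (hypothetical key layout and payloads).
        _, err := client.SetAll(ctx, &schema.SetRequest{
            KVs: []*schema.KeyValue{
                {Key: []byte("order/1001"), Value: []byte(`{"customer":"c-42","total":"99.90"}`)},
                {Key: []byte("order/1001/item/1"), Value: []byte(`{"sku":"A-1","qty":2}`)},
                {Key: []byte("order/1001/item/2"), Value: []byte(`{"sku":"B-7","qty":1}`)},
            },
        })
        if err != nil {
            log.Fatal(err)
        }

        // Fetch the current cryptographic state (latest tx id and root hash)
        // so the application or an external auditor can track it over time.
        state, err := client.CurrentState(ctx)
        if err != nil {
            log.Fatal(err)
        }
        log.Printf("db %s: tx %d, root hash %x", state.Db, state.TxId, state.TxHash)
    }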

Use Case - IoT Sensor Data

Goal:

IoT sensor data received by devices collecting environment data needs to be stored locally in a cryptographically verifiable manner until the data is transferred to a central datacenter. The data integrity needs to be verifiable at any given point in time and while in transit.

Implementation:

immudb runs embedded on the IoT device itself and is continuously audited by external probes. The data transferred for auditing is minimal, so auditing works even with minimal bandwidth and unreliable connections.

Whenever the IoT devices have a high-bandwidth connection, the data is transferred to a data center (a large immudb deployment) and the data integrity at both source and destination is fully verified.
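
To make the auditing idea concrete, here is a hedged Go sketch of an external probe: it pulls only immudb's current state (latest transaction id and root hash) from the device and compares it with the state recorded on the previous run. The device address and the local state file are assumptions, and a production auditor would additionally request consistency proofs between the two states, which is omitted here for brevity.

    package main

    import (
        "bytes"
        "context"
        "encoding/json"
        "log"
        "os"

        immudb "github.com/codenotary/immudb/pkg/client"
    )

    // state remembered by the probe between runs (hypothetical format)
    type seenState struct {
        TxId   uint64 `json:"txId"`
        TxHash []byte `json:"txHash"`
    }

    func main() {
        ctx := context.Background()

        // The probe talks to the immudb instance running on (or next to) the device.
        opts := immudb.DefaultOptions().WithAddress("device.local").WithPort(3322)
        client := immudb.NewClient().WithOptions(opts)
        if err := client.OpenSession(ctx, []byte("immudb"), []byte("immudb"), "defaultdb"); err != nil {
            log.Fatal(err)
        }
        defer client.CloseSession(ctx)

        // Only the current cryptographic state is transferred, which keeps the
        // audit traffic minimal even on poor connections.
        state, err := client.CurrentState(ctx)
        if err != nil {
            log.Fatal(err)
        }

        var prev seenState
        if raw, err := os.ReadFile("device-state.json"); err == nil {
            if err := json.Unmarshal(raw, &prev); err == nil {
                // History must only move forward: a lower tx id, or the same
                // tx id with a different root hash, indicates rewritten data.
                if state.TxId < prev.TxId {
                    log.Fatalf("possible tampering: tx id went from %d to %d", prev.TxId, state.TxId)
                }
                if state.TxId == prev.TxId && !bytes.Equal(state.TxHash, prev.TxHash) {
                    log.Fatalf("possible tampering: same tx id %d, different root hash", state.TxId)
                }
            }
        }

        raw, _ := json.Marshal(seenState{TxId: state.TxId, TxHash: state.TxHash})
        if err := os.WriteFile("device-state.json", raw, 0o600); err != nil {
            log.Fatal(err)
        }
        log.Printf("device state audited: tx %d", state.TxId)
    }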

Use Case - DevOps Evidence

Goal:

CI/CD and application build logs need to be stored in an auditable and tamper-evident way.
Very high performance is required, as the system should not slow down any build process.
Scalability is key, as billions of artifacts are expected within the next few years.
In addition to integrity validation, data needs to be retrievable by pipeline job id or digital asset checksum.

Implementation:

As part of the CI/CD audit functionality, data is stored in immudb using its key/value functionality. The key is either the CI/CD job id (e.g. from Jenkins or GitLab) or the checksum of the resulting build artifact or container image.
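
A minimal sketch of that key/value layout with the Go SDK, assuming hypothetical key prefixes (job/<id>/log and artifact/<checksum>/log) and local files standing in for the build log and the produced artifact:

    package main

    import (
        "context"
        "crypto/sha256"
        "fmt"
        "log"
        "os"

        immudb "github.com/codenotary/immudb/pkg/client"
    )

    func main() {
        ctx := context.Background()

        opts := immudb.DefaultOptions().WithAddress("127.0.0.1").WithPort(3322)
        client := immudb.NewClient().WithOptions(opts)
        if err := client.OpenSession(ctx, []byte("immudb"), []byte("immudb"), "defaultdb"); err != nil {
            log.Fatal(err)
        }
        defer client.CloseSession(ctx)

        buildLog, err := os.ReadFile("build.log")
        if err != nil {
            log.Fatal(err)
        }
        artifact, err := os.ReadFile("app.tar.gz")
        if err != nil {
            log.Fatal(err)
        }
        checksum := fmt.Sprintf("%x", sha256.Sum256(artifact))

        // Store the build evidence under both lookup keys: the pipeline job id
        // and the checksum of the produced artifact.
        if _, err := client.VerifiedSet(ctx, []byte("job/4711/log"), buildLog); err != nil {
            log.Fatal(err)
        }
        if _, err := client.VerifiedSet(ctx, []byte("artifact/"+checksum+"/log"), buildLog); err != nil {
            log.Fatal(err)
        }

        // Later, an auditor can retrieve the evidence by artifact checksum
        // alone, with the inclusion proof verified by the SDK.
        entry, err := client.VerifiedGet(ctx, []byte("artifact/"+checksum+"/log"))
        if err != nil {
            log.Fatal(err)
        }
        log.Printf("retrieved %d bytes of verified build evidence", len(entry.Value))
    }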
