KPI Partners Blog

Using WITH in Physical Tables ("Opaque Views")

Posted by KPI Partners News Team on Fri, May 24, 2024 @ 02:15 AM

by Kurt Wolff

Common practice is to use derived tables (sometimes referred to as “inline views”) when creating a physical layer “Select” table.
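As a sketch of the contrast the post draws, here is a derived table ("inline view") and its equivalent WITH form; the table and column names (sales, regions, region_id) are illustrative, not taken from the post itself:

```sql
-- Derived table ("inline view") form:
SELECT r.region_name, t.total_amount
FROM regions r
JOIN (SELECT region_id, SUM(amount) AS total_amount
      FROM sales
      GROUP BY region_id) t
  ON t.region_id = r.region_id;

-- Equivalent WITH form, as might be pasted into a
-- physical-layer "Select" table:
WITH region_totals AS (
  SELECT region_id, SUM(amount) AS total_amount
  FROM sales
  GROUP BY region_id
)
SELECT r.region_name, t.total_amount
FROM regions r
JOIN region_totals t
  ON t.region_id = r.region_id;
```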

Read More

Tags: Kurt Wolff, Oracle BI, Blog

Evaluate With Date Functions in OBIEE

Posted by KPI Partners News Team on Fri, May 24, 2024 @ 01:56 AM

by Kurt Wolff

If you’re not concerned about database portability – for example, you use Oracle and that’s that, forever – then the evaluate function in OBIEE can be useful. However, using the evaluate function can be tricky, and the documentation could be better.
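For context, EVALUATE passes a database-specific function straight through to the underlying database; the general shape is EVALUATE('db_function(%1...%N)' AS data_type, arg1, ..., argN). A minimal sketch with an Oracle date function, where "Time"."Day Date" stands in for a hypothetical repository column:

```sql
-- Push Oracle's ADD_MONTHS down to the database;
-- %1 and %2 are placeholders filled by the trailing arguments.
EVALUATE('ADD_MONTHS(%1, %2)' AS DATE, "Time"."Day Date", 3)
```

Because the quoted expression is shipped to the database as-is, it only works when the target database actually supports that function, which is why the post frames EVALUATE as a trade against portability.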

Read More

Tags: Kurt Wolff, Oracle BI, Blog

Looker Tips and Tricks - Hub and Spoke Implementation

Posted by KPI Partners News Team on Mon, Nov 22, 2021 @ 03:20 PM

by Vyshnavi. CR

Looker Tips and Tricks - Hub and Spoke Implementation

In a large organization with a huge number of users, requirements are difficult to handle. Looker helps us provide a generic data model and allows users to append or remove code. The Hub and Spoke implementation helps us achieve this.

Read More

Tags: Blog, Looker, Spoke Implementation, Vyshnavi. CR, Looker Tips and Tricks, Hub and Spoke Implementation

Data Security in Snowflake

Posted by KPI Partners News Team on Mon, Aug 09, 2021 @ 01:28 PM

by Ramana Kumar Gunti

Data Security in Snowflake

Data security is probably the number one topic on everyone's mind when it comes to moving your data into the cloud. This is especially true if you are new to the cloud and don't have a seasoned technical team that understands cloud security. In this article, we will talk about how Snowflake helps. So let's get started.

While the world is moving towards cloud computing, cloud storage, and other cloud technologies that run on public networks, it is important to secure and safeguard the data.

To achieve data security, Snowflake provides a handful of methodologies.

  • Data Encryption
  • RLS – Row Level Security
  • CLS – Column Level Security (Data Masking)
Read More

Tags: Blog, Snowflake, Data Security in Snowflake

Data Security Snowflake Part 2 - Column Level Security

Posted by KPI Partners News Team on Fri, Dec 04, 2020 @ 07:59 AM

by Ramana Kumar Gunti

Data Security in Snowflake using CLS

Large companies and professional businesses have to make sure that data is kept secure based on the roles and responsibilities of the users who are trying to access the data. One way to protect data is to enforce “Column Level Security” (CLS) to ensure that people can only access what they are supposed to see.

In this article, we will show how Column Level Security can be implemented using the Snowflake Data Masking Policy feature. So, let's get started!
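As a minimal sketch of the idea, a masking policy can be defined once and attached to a column, so that only authorized roles see the raw value; the table, column, and role names here are hypothetical:

```sql
-- Define a policy: only HR_ADMIN sees real values.
CREATE OR REPLACE MASKING POLICY email_mask AS (val STRING)
RETURNS STRING ->
  CASE
    WHEN CURRENT_ROLE() IN ('HR_ADMIN') THEN val
    ELSE '*** MASKED ***'
  END;

-- Attach the policy to a column.
ALTER TABLE employees
  MODIFY COLUMN email SET MASKING POLICY email_mask;
```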

Read More

Tags: Blog, Snowflake, Data Security in Snowflake, Column Level Security

Data Security Snowflake Part 1 - Row Level Security

Posted by KPI Partners News Team on Fri, Dec 04, 2020 @ 07:58 AM

by Ramana Kumar Gunti

Data Security in Snowflake using RLS

Large companies and professional businesses have to make sure that data is kept secure based on the roles and responsibilities of the users who are trying to access the data. One way to protect data is to enforce “Row Level Security” (RLS) to ensure that people can only access what they are supposed to see.

Snowflake is working on fine-tuning Row Access Policies for general availability. In this article, we will show an approach to restricting row-level data to authorized users using the Snowflake functions CURRENT_ROLE and CURRENT_USER.

So, Let’s get started!!
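One common shape for this approach is a secure view that joins the data to a role-to-region mapping table and filters on CURRENT_ROLE(); the sales and region_role_map tables below are illustrative placeholders, not the post's actual schema:

```sql
-- Mapping table: which role may see which region's rows.
-- (Hypothetical; maintained by an administrator.)
-- region_role_map(role_name STRING, region STRING)

CREATE OR REPLACE SECURE VIEW sales_v AS
SELECT s.*
FROM sales s
JOIN region_role_map m
  ON m.region = s.region
WHERE m.role_name = CURRENT_ROLE();
```

Users query sales_v instead of the base table, and each role sees only its own rows.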

Read More

Tags: Blog, Snowflake, Ramana Kumar Gunti, Data Security Snowflake, Row Level Security

Data Sharing in Snowflake

Posted by KPI Partners News Team on Tue, Nov 10, 2020 @ 03:22 AM

by Ramana Kumar Gunti

Data Sharing in Snowflake

In this article, we will talk about Snowflake data sharing which enables account-to-account sharing of data through Snowflake database tables, secure views, and secure UDFs. So let's get started.

Snowflake data sharing is a powerful yet simple feature for sharing data from one account and using that shared data from another. A data producer can give any number of data consumers access to its live data within minutes, without copying or moving the data. Data consumers can query the shared data without performance bottlenecks, thanks to Snowflake's multi-cluster, shared-data architecture.
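At a high level, the producer creates a share, grants objects to it, and adds consumer accounts; the consumer then creates a read-only database from the share. The database, schema, and account names below are placeholders:

```sql
-- On the producer account:
CREATE SHARE sales_share;
GRANT USAGE  ON DATABASE sales_db               TO SHARE sales_share;
GRANT USAGE  ON SCHEMA   sales_db.public        TO SHARE sales_share;
GRANT SELECT ON TABLE    sales_db.public.orders TO SHARE sales_share;
ALTER SHARE sales_share ADD ACCOUNTS = consumer_acct;

-- On the consumer account:
CREATE DATABASE shared_sales FROM SHARE producer_acct.sales_share;
```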

Read More

Tags: Oracle Data Integrator (ODI), Blog, Snowflake, Ramana Kumar Gunti

Oracle Data Integrator with Snowflake

Posted by KPI Partners News Team on Thu, Sep 24, 2020 @ 07:21 PM

by Abdul Mathin Shaik

Oracle Data Integrator with Snowflake

In this article, we will talk about how to load data into the Snowflake data warehouse using Oracle Data Integrator (ODI). So let's get started.

Recently, one of our Snowflake customers was looking to use Oracle Data Integrator to load data into Snowflake from an Oracle data warehouse. I have been working with Oracle applications such as ODI for years, so I wanted to explore how to configure ODI to work with Snowflake.

Read More

Tags: Oracle Data Integrator (ODI), Blog, Snowflake, ODI with Snowflake, Abdul Mathin Shaik

Manual performance optimization in Denodo – Most relevant and most used techniques

Posted by KPI Partners News Team on Mon, Aug 24, 2020 @ 07:46 AM

by Manju Das

Manual performance optimization in Denodo – Most relevant and most used techniques

We are going to discuss how to optimize queries to make the most of the optimization capabilities of the Denodo platform. Below are the most powerful query optimization techniques that we can apply manually in Denodo to significantly increase performance.

Read More

Tags: Blog, Manju Das, Denodo, Most relevant and most used techniques, Manual performance optimization in Denodo

BigQuery Best Practices to Optimize Cost and Performance

Posted by KPI Partners News Team on Mon, Aug 24, 2020 @ 07:46 AM

by Pushkar Reddipalli

BigQuery Best Practices to Optimize Cost and Performance

BigQuery is Google's fully managed, petabyte-scale, low-cost analytics data warehouse. It allows you to execute terabyte-scale queries, get results in seconds, and gives you the benefits of a fully managed solution. BigQuery and other Google Cloud Platform (GCP) services offer "pay per use" pricing. If you know how to write SQL queries, you already know how to query it. In fact, there are plenty of interesting public data sets shared in BigQuery, ready to be queried by you.

In this blog post, I will guide you through best practices for implementing Google BigQuery to control costs and optimize performance, regardless of your level of development experience. To start working with BigQuery, you must first upload your data, which means you will also use it as storage, and then query that data in some way to create your models or reports. In the next section, we will analyze best practices for these main actions of loading, querying, and storing.
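One cost lever worth illustrating: because BigQuery bills on-demand queries by bytes scanned, selecting only the columns you need and filtering on a partition column keeps scans small. The project, dataset, and table names below are hypothetical, and the sketch assumes orders is partitioned on order_date:

```sql
-- Avoid SELECT *: BigQuery charges for every column scanned.
-- A filter on the partitioning column prunes whole partitions.
SELECT order_id, amount
FROM `myproject.mydataset.orders`
WHERE order_date BETWEEN '2020-01-01' AND '2020-01-31';
```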

Read More

Tags: Blog, Data Warehouse, BigQuery, BigQuery to optimize cost and performance, Pushkar Reddipalli, BigQuery Best Practices