Search results for tag "Data Warehouse"

1-10 of 168 results

Snowflake Data Warehouse Solutions

While there are many approaches to cloud data, Accion Data Labs can help you quickly implement and gain excellent ROI from the Snowflake Data Warehouse, letting you access, integrate, and analyze data with near-infinite scalability, enabled automatically or on the fly.

How to Structure Content Using HTML5 Semantic Tags

Dinanath Jayaswal

The article provides an overview of the latest meaningful HTML5 semantic tags, such as header, section, footer, article, aside, and nav, along with WAI-ARIA accessibility attributes. Using these semantic tags can modernize your web pages and significantly improve search engine interactions.

Sarika Jagannath Nikam

I am an MSBI and IBM technology professional with strong experience in data warehouses, queries, and reports. I have worked on several projects in ERP, healthcare, CRM, and banking & finance.

My technology expertise of more than 10 years includes Microsoft SQL technologies, e-commerce, data governance, data warehousing, and IBM Cognos.

Cloud Data Warehouse

Understand how Accion Data Labs can help you quickly and cost-effectively use Snowflake to drive rapid insights with a scalable, modern data analytics solution in the cloud.

Ramesh Babu Vegi

7+ years of progressive experience in the BI industry, spanning product development and application development, with expertise in Oracle SQL, multiple ETL and ELT tools (Oracle Data Integrator, Oracle Warehouse Builder, Informatica), and reporting tools (OBIEE, QlikView); also trained on Tableau (reporting).

Abhishek Singh

Over 1.4 years of total IT professional experience in Big Data and data warehousing (ETL/ELT) technologies, including requirements gathering, data analysis, design, development, system integration testing, deployments, and documentation.

● Research and develop machine learning models for defense applications.
● Research models for human detection in any climate conditions.
● Provide software specifications and production-quality code, and engage in algorithm proliferation activities.
● Research and develop machine learning algorithms for drone night-vision human detection.
● Hands-on experience in Big Data solutions using Hadoop, HDFS, MapReduce, Spark, Pig, Hive, Kafka, Sqoop, ZooKeeper, Flume, and Oozie.
● Excellent knowledge of and hands-on experience with Hadoop architecture and its components, such as HDFS, JobTracker, TaskTracker, NameNode, DataNode, the MapReduce programming paradigm, and monitoring systems.
● Hands-on experience in installing, configuring, using, and managing Hadoop ecosystem components.
● Experience in importing and exporting data with Sqoop between HDFS/Hive and relational database systems.
● Experienced and well versed in writing and using UDFs in both Hive and Pig using Scala.
● Excellent understanding of different storage concepts, such as block storage, object storage, column storage, and compressed storage.
● Extensive experience in Extraction, Transformation & Loading (ETL and ELT) of data from various sources into data warehouses and data marts, following industry best practices.
● Experience with Informatica ETL for data movement, data transformations, and data loads.
● Good working experience with different relational database systems.
● Very good understanding of and implementation experience in building data warehouses and data marts: OLTP vs. OLAP, star vs. snowflake schema, and normalization vs. denormalization methods.
● Hands-on experience in building wrapper shell scripts and using shell commands for analysis.
● Supported various reporting teams; experience with the data visualization tool Tableau.
● Very good at SQL, data analysis, unit testing, and debugging data quality issues.

Information Lifecycle Management for Large Data Warehouses

Aswani Karteek Yadavilli

Jim Barksdale, former Netscape CEO, once said, “If we have data, let’s look at data. If all we have are opinions, let’s go with mine.”

Facebook open sources realtime big data search with Presto

Ashutosh Bijoor

At a conference for developers at Facebook headquarters on Thursday, engineers working for the social networking giant revealed that it’s using a new homemade query engine called Presto to do fast interactive analysis on its already enormous 250-petabyte-and-growing data warehouse.

What Are the New Data Lake Patterns?

The concept of a traditional data warehouse is a very efficient one. In fact, it is so efficient that we have started using it for all kinds of analytics. But when we encounter challenges such as schema changes, excessive data volumes, and difficult identity resolution, the traditional approach falls short of expectations. An alternative approach leveraging the concept of the data lake, referred to as “the Data Lake Pattern,” has gained a lot of momentum.

Siva Kalyan Karpurapu

A Certified Informatica Developer and certified data warehousing specialist, with experience in Informatica, DataStage, OBIEE, Pentaho BI, and more.

