Recently, Microsoft announced Power BI integration with Jupyter. Now we can tell data stories using Jupyter notebooks.

With Microsoft's new powerbiclient package, we can easily export and embed Power BI reports, dashboards, dashboard tiles, report visuals, or Q&A in Jupyter notebooks.
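A minimal sketch of embedding a report in a notebook cell, assuming the powerbiclient package is installed and following its documented device-code sign-in flow; the workspace and report IDs below are placeholders, not real values:

```python
# Minimal sketch, assuming the `powerbiclient` package is installed and you have
# access to a Power BI workspace. The IDs below are placeholders.
from powerbiclient import Report
from powerbiclient.authentication import DeviceCodeLoginAuthentication

# Interactive device-code sign-in to the Power BI service
auth = DeviceCodeLoginAuthentication()

report = Report(
    group_id="<workspace-id>",   # placeholder workspace (group) ID
    report_id="<report-id>",     # placeholder report ID
    auth=auth,
)

report  # evaluating the object in a Jupyter cell renders the embedded report
```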

Power BI is a set of software services, apps, and connectors that work together to transform data from different data sources into logical, visually immersive, and interactive insights.

Our data could be in the form of an Excel spreadsheet or a collection of hybrid data warehouses that are both cloud-based and on-premises.

Power BI makes it simple…



What is Sentiment Analysis?

Sentiment analysis algorithms primarily focus on processing opinions, attitudes, and even emoticons in a corpus of texts. The range of detected sentiments varies significantly from one technique to another. While a standard analyzer defines up to three basic polar emotions (positive, negative, neutral), more advanced models have a broader range.

TextBlob:

TextBlob is an open-source Python library for processing textual data. It offers a simple API to access its methods and perform basic NLP tasks.
TextBlob performs different operations on textual data such as noun phrase extraction, sentiment analysis, classification, translation, etc.
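As a quick sketch (assuming the textblob package is installed and its corpora downloaded), sentiment analysis and noun phrase extraction look like this:

```python
# Quick sketch, assuming `pip install textblob` and `python -m textblob.download_corpora`
from textblob import TextBlob

text = "Power BI makes it really simple to build beautiful interactive reports."
blob = TextBlob(text)

# polarity ranges from -1 (negative) to +1 (positive);
# subjectivity ranges from 0 (objective) to 1 (subjective)
print(blob.sentiment.polarity, blob.sentiment.subjectivity)

# noun phrase extraction
print(blob.noun_phrases)
```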


Dynamic Programming (DP) is an algorithmic technique for solving an optimization problem by breaking it down into smaller sub-problems and taking advantage of the fact that the optimal solution to the overall problem depends on the optimal solutions to its sub-problems.

Dynamic Programming is primarily an optimization over plain recursion. We can use Dynamic Programming to optimize any recursive solution that has repeated calls for the same inputs. The idea is to save the results of sub-problems so that we don't have to recalculate them later. This simple optimization reduces the time complexity from exponential to polynomial.
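To make the memoization idea concrete, here is a minimal Python sketch using Fibonacci numbers (an illustrative choice, not taken from the article):

```python
from functools import lru_cache

# Naive recursion recomputes the same sub-problems exponentially many times.
# Caching (memoizing) each result means every sub-problem is solved only once,
# bringing the running time down from exponential to linear in n.
@lru_cache(maxsize=None)
def fib(n: int) -> int:
    if n < 2:
        return n
    return fib(n - 1) + fib(n - 2)

print(fib(40))  # 102334155, computed almost instantly thanks to memoization
```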

For example…


There are many useful search algorithms, each suited to different problems depending on the requirement we want to solve. They are broadly divided into two main categories: sequential searches, which scan the elements one by one, and interval searches, which repeatedly narrow down a sorted range. The common algorithms for finding a value in a given array or list are listed below (a quick sketch of the two most common ones follows the list):

· Linear Search

· Binary Search

· Jump Search

· Interpolation Search

· Exponential Search

· Sublist Search (Search a linked list in another list)

· Fibonacci Search

· The Ubiquitous Binary Search

· Recursive program to linearly search an element in a given array

· Recursive…
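As mentioned above, here is a quick sketch of the two most common approaches, linear search and binary search (illustrative code, not from the article):

```python
from typing import Optional, Sequence

def linear_search(items: Sequence[int], target: int) -> Optional[int]:
    """Sequential search: scan every element in order; O(n), works on unsorted data."""
    for index, value in enumerate(items):
        if value == target:
            return index
    return None

def binary_search(items: Sequence[int], target: int) -> Optional[int]:
    """Interval search: repeatedly halve a *sorted* range; O(log n)."""
    low, high = 0, len(items) - 1
    while low <= high:
        mid = (low + high) // 2
        if items[mid] == target:
            return mid
        if items[mid] < target:
            low = mid + 1
        else:
            high = mid - 1
    return None

data = [2, 5, 8, 12, 16, 23, 38]
print(linear_search(data, 16))  # 4
print(binary_search(data, 23))  # 5
```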


The normal distribution is a probability function that defines how the values of a variable are distributed. The normal distribution is a symmetric distribution where most of the observations cluster around the central peak and the probabilities for values further away from the mean taper off equally in both directions.

The normal distribution is also called the bell curve or Gaussian distribution. It is bell-shaped and symmetric about a vertical line through its center. The mean, median, and mode are all equal and located at the center of the distribution.
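For reference, the density of a normal distribution with mean \mu and standard deviation \sigma is given by the standard formula (stated here for completeness, not taken from the excerpt):

```latex
f(x) = \frac{1}{\sigma\sqrt{2\pi}} \, e^{-\frac{(x-\mu)^2}{2\sigma^2}}
```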

In the image below, we can see how the values are distributed along the x…


Correlation is a measure of the strength of the linear relationship between two quantitative variables (e.g., height, weight).

Sometimes two or more events are interrelated, i.e., any change in one event may affect the others. If such changes are expressed in the form of numerical data and they appear to be interdependent, they are said to be correlated. For example, the weight of the human body increases with increases in height and age. Here, age and body weight are two separate characteristics, but they are interdependent, so they are correlated. …
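As a small illustrative sketch (with made-up numbers, not from the article), the Pearson correlation coefficient for such height/weight data can be computed like this:

```python
import numpy as np

# Hypothetical height (cm) and weight (kg) measurements, for illustration only
height = np.array([150, 155, 160, 165, 170, 175, 180])
weight = np.array([50, 53, 57, 62, 66, 71, 75])

# Pearson correlation coefficient: +1 means a perfect positive linear relationship
r = np.corrcoef(height, weight)[0, 1]
print(round(r, 3))  # close to 1, since weight rises steadily with height
```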


· Inferential statistics works with a random sample of data taken from a population to illustrate and make inferences about that population.

· Inferential statistics is valuable when examining each member of an entire population is not convenient or possible.

· Inferential statistics helps us reach conclusions and make predictions based on our data.

· Inferential statistics lets us understand the whole population from a sample taken from it.

· In inferential statistics we use a random sample, so we can generalize the outcome from the sample to the larger population (a small sketch follows this list).

· In Inferential statistics, we can calculate the…
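As an illustrative sketch (made-up sample data, not from the article), estimating a population mean with a 95% confidence interval from a random sample might look like this:

```python
import numpy as np
from scipy import stats

# Hypothetical random sample drawn from a much larger population (illustration only)
sample = np.array([12.1, 11.8, 12.5, 12.0, 11.6, 12.3, 12.2, 11.9, 12.4, 12.0])

mean = sample.mean()
sem = stats.sem(sample)  # standard error of the mean

# 95% confidence interval for the population mean, using the t-distribution
low, high = stats.t.interval(0.95, df=len(sample) - 1, loc=mean, scale=sem)

print(f"sample mean = {mean:.2f}, 95% CI = ({low:.2f}, {high:.2f})")
```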


Statistics:

Statistics is a major contributor to scientific research and analysis. Statistics uses quantitative knowledge to create data collection schemes, process the data, analyze the data, and interpret the outcome. Further, statisticians often make critical assessments of the consistency of data and of whether inferences drawn from it can be made confidently. Using statistics, facts and figures are collected, categorized, analyzed, and condensed for presentation and analysis.

Data may be categorized as either quantitative or qualitative.

Descriptive statistics:

· Descriptive statistics is basically about describing and understanding the specification and characteristics of sample data from a population (a quick sketch follows this list).

· Descriptive statistics…
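A quick sketch of common descriptive statistics computed on a made-up sample (illustration only):

```python
import statistics

# Hypothetical sample data (illustration only)
sample = [4, 8, 6, 5, 3, 7, 8, 9, 5, 8]

print("mean   =", statistics.mean(sample))              # 6.3
print("median =", statistics.median(sample))            # 6.5
print("mode   =", statistics.mode(sample))              # 8
print("stdev  =", round(statistics.stdev(sample), 2))   # sample standard deviation
```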


What is Project Titan?

Project Titan is a secret project that Apple started working on with internal and some external resources, without announcing it publicly.

Starting in 2014, Apple began working on “Project Titan,” with upwards of 1,000 employees working on developing an electric vehicle at a secret location near its Cupertino headquarters.

Background:

We’re no strangers to surprises from Apple; the company has shocked us before. From the iPod to the latest generation of iPhone, it has astonished its customers as well as the technology giants.

Apple has already made its brand value and the iPhone X…


Data preparation:

Data preparation is the process of cleaning and transforming raw data prior to processing and analysis. It is an important step that often involves reformatting data, making corrections to it, and combining data sets to enrich the data.

Data preparation is often a lengthy undertaking for data professionals or business users, but it is essential as a prerequisite to put data in context in order to turn it into insights and eliminate bias resulting from poor data quality.

The data preparation process usually includes standardizing data formats, enriching source data, and/or removing outliers.
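A small pandas sketch of these steps on made-up data (the column names, values, and outlier threshold are illustrative assumptions):

```python
import pandas as pd

# Hypothetical raw data (illustration only)
raw = pd.DataFrame({
    "order_date": ["2021-01-05", "2021-02-05", "2021-03-09"],
    "region": [" East ", "west", "East"],
    "amount": [120.0, 135.5, 9999.0],   # the last value is an obvious outlier
})

df = raw.copy()

# Standardize formats: proper datetime type, consistent casing and whitespace
df["order_date"] = pd.to_datetime(df["order_date"])
df["region"] = df["region"].str.strip().str.lower()

# Remove outliers, here with a simple fixed threshold chosen purely for illustration
df = df[df["amount"] < 1000]

print(df)
```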

Swapnil Bandgar

Immature coder, PG student @Loyalist, Toronto Canada.
