Odoo v9 Community vs Odoo v9 Enterprise

Odoo 9 New Features
  • by bista-admin
  • Jun 20, 2016

Odoo is a fully functional, modular Enterprise Resource Planning (ERP) system. Technically, Odoo Enterprise is Odoo Community with some additional functional modules. Odoo Community is the core system, consisting of the databases, the core applications, and basic functionality. The Enterprise edition comes with support for migration to future versions by Odoo engineers, and it also allows you to install apps from the Odoo Store. However, it is very important to note that the Enterprise edition is not open source: the code is made available only to official Odoo partners and to Odoo itself, and no other party is allowed to sell the Enterprise Edition.

In previous versions of Odoo (6, 7, and 8), there was no evident technical or functional difference between the two editions (Community and Enterprise). The new Odoo v9, however, has brought about many changes. The graphic below summarizes the major differences between the Community and Enterprise editions, helping you understand which edition to choose based on your requirements.

[Figure: Enterprise vs. Community functionality]

Specialisation

Now let us consider the various characteristics on which the table below differentiates the Enterprise and Community editions of Odoo v9.

[Table: Odoo v9 Enterprise vs. Community comparison]

Features

In this section, we will cover the major feature differences between the Community and Enterprise editions of Odoo v9, going through them section by section.

Planner Facility:

The planner is a new facility, available in the Enterprise edition only, that shows the setup status of your apps. It sits at the top of the menu bar and opens when you click the status bar. Below is a snapshot of a planner marked 15% done.

[Screenshot: planner status]

Finance/Accounting Management:

Features In Enterprise,

The accounting changes are a pretty big game changer in Odoo 9 and include the following:

  • Basic Accounting

  • No need to create a fiscal year: just define the dates in the accounting configuration

  • Lock Entries: invoices are locked once a lock date is set

  • Dynamic Reports & Executive Summary

  • Legal Statements

  • Bank Interfaces (US, NZ & Canada)

  • Reconcile payments against pending invoices from the dashboard

  • Statement Import: CODA, OFX, QIF

  • Internal transfers: e.g. bank to cash, made easy

  • Check Printing & Deposit

  • Customer Follow-ups

Features In Community,

The Community edition has basic accounting features available, which are enough for ordinary accounts management:

  • Accounting Dashboards

  • Check Printing and Deposits

  • Bank Reconciliations

  • General Ledger Maintenance

Accounting Reports

Here is a snapshot of what an accounting report looks like:

[Screenshot: accounting report]

Project Management:

The project management module in Odoo handles organizing, scheduling, planning, developing, and analyzing the entire project. It also lets you check the availability of resources and allocate them to the project. Let us now look at the differences between the Enterprise and Community versions of the project management module in Odoo v9:

Features In Enterprise,

  • Tasks

  • Issues

  • Timesheets

  • Timesheets Chrome & Mobile App

  • Forecasts

  • Gantt Charts

Features In Community,

  • Tasks

  • Issues

  • Timesheets

Odoo v9 Timesheets

Here is how timesheet activities look in Odoo v9:

[Screenshot: timesheet]

Sales and CRM Management:

The Odoo Sales module manages all the quotations, opportunities, product lines, and so on. It is also integrated with well-known shipping services such as FedEx, UPS, and DHL. Sales operations are made very easy in Odoo and include the following:

Features In Enterprise,

  • Sales

  • CRM

  • VoIP Integration

  • Customer Portal

  • Signature

  • Subscriptions

Features In Community,

  • Sales

  • CRM

Smart Sales Dashboard

  • Great sales journeys start with the Sales Dashboard.

  • Faster user interface designed for sales.

  • All the information you need where you need it.

  • Send quotes in just a few clicks, manage your pipeline with drag & drop, etc.

  • Full overview of your personal activities, next actions, and performance

[Screenshot: sales dashboard]

Inventory Management:

Inventory Management helps in organizing warehouse stock, managing on-time deliveries, back orders, transfers, and so on. Odoo automatically calculates shipping prices, validates customer addresses, prints shipping labels, and handles order cancellations. This module is integrated with Manufacturing, Sales, and Purchase, which keeps everyone in sync.

Features In Enterprise,

  • Inventory Management

  • MRP Management

  • Purchase Management

  • Barcode Support (barcode scanner connected to the system)

  • Integrated shipping services: UPS, FedEx, DHL, USPS, Temando

  • Multi-Company Flows

Features In Community,

  • Inventory Management

  • MRP

  • Purchase Management

Website Builder:

The Odoo Website Builder module helps you build attractive websites that rank better on search engines. It has great marketing tools, which in turn attract users to the website, and it builds sites that can be viewed easily by all users irrespective of the device they are using. The Website Builder also manages all the SEO-related activities of the site. Odoo eCommerce websites are integrated with shipping services like FedEx, UPS, and DHL, and in addition the products from Odoo eCommerce websites can be sold on Amazon and eBay too.

Features In Enterprise,

  • Website Builder

  • Blogs

  • Presentations

  • Themes (Free)

  • Form Builder

  • Call-to-Actions Blocks

  • Versioning

  • A/B Testing

Features In Community,

  • Website Builder

  • Blogs

  • Presentations

  • Themes (Free)

Marketing Management:

The Odoo Marketing module helps the user manage all the marketing aspects of a company, including building great campaigns, sending mass mail to customers, calculating the ROI per customer, tracking your links, developing content management plans, and so on.

Features In Enterprise,

  • Events
  • Expenses
  • Email Marketing
  • Live Chat
  • Lead Scoring
  • Email Marketing Templates

Features In Community,

  • Events
  • Expenses
  • Email Marketing
  • Live Chat

Digital Signatures for Documents:

The Odoo eSignature feature lets you share documents online in a single click. It is the easiest way to upload, verify, and sign documents, avoiding the manual pain of printing and scanning; it is fast and paperless. It provides user-friendly drag-and-drop functionality for adding fields to a document, such as initials (name), signature, date, and email, and it allows you to specify who is supposed to fill in each field. A document can be sent to multiple people at the same time, and you can manage and track documents easily. Odoo eSignature is also fully secure.

Here is a snapshot of adding fields to a document:

[Screenshot: adding fields to a document]

Reference link: https://www.odoo.com/editions

These were the main highlights of Odoo v9 Community vs Odoo v9 Enterprise. If you’re looking for more insights on the various Odoo functionalities and modules, you can get in touch with us at sales@bistasolutions.com

Also do let us know what you think about this blog at feedback@bistasolutions.com

When testing gets better, business runs smoother

Agile Testing

When it comes to testing, the question that arises is: why do executives see testing as an epic fail? According to software testing expert Scott Barber, it all comes down to accounting. “When you look at the accounting spreadsheet,” Scott says, “testing is a cost center, not a profit center.” This is how executives look at testing; they don’t care about testing itself, only about what kind of value it brings to the product. Indeed, a good-quality product sells better, but only up to the point of diminishing returns. The value of any product lies in shorter time to market, error-free software, conformance to customer requirements, and compliance with standards.

But what brings you this value and better-quality software? Testing does! And to achieve it at scale, Agile Testing comes into the picture. Agile Testing ensures that your product doesn’t affect interacting systems in a negative manner. Instead, agile software development and agile testing help encourage repeat sales and gain customers’ trust.

Business Driven Test Management (BDTM)

In today’s rapidly growing business world, companies are focused more and more on achieving maximum business value from their product, service, or software. As a result, high-speed software delivery and high quality are even more important, and the likelihood of deficient software quality is even greater. Here is where Business Driven Test Management (BDTM) comes into the picture. BDTM acts as a guide and demonstrates how to organize, manage, and execute a test process.

It converts an organization’s business goals into test goals, allowing a client to control the test process, and consequently the results of testing, more effectively. Based on rational, economic business considerations and identified risks, the right components are tested, and throughout the process there is a strong focus on clear and effective client communication.

BDTM Process

The BDTM process includes a set of iterative steps to be followed. We describe them here, with an iterative process diagram below:

1. Prepare the assignment and identify the test goals.

2. Determine the risk class for each combination of characteristic and object part.

3. Decide whether each combination of characteristic and object part has to be tested thoroughly or ‘lightly’.

4. Estimate and plan test execution.

5. Assign the most appropriate test design techniques.

6. Give the client and other stakeholders of the project/product appropriate insight into the test process and test objects throughout test execution, and keep track of all change requests through proper documentation.

Also, in the evolution of business and technology today, organizations are under immense pressure to remain competitive in the market. They need reliable and efficient systems that are capable of supporting complex business processes. In addition, organizations need to grow their competencies at a faster pace. To achieve this, business leaders today are aiming at developing a Testing Center of Excellence (TCoE). The TCoE is a command center which provides standardized methodology, testing best practices, metrics, and automation tools, and it ensures high product quality during the development cycle as well as during deployment. Some of the advantages of a TCoE are greater agility, cost efficiency, better quality, and faster releases. The TCoE can also be phased like the BDTM process; the following are some of the important phases on your path:

1. Establish standards, policies, and basic governance measures for the testing methodologies.

2. Determine the product and test infrastructure and standardize your testing tools, which will consolidate the cost of procuring them.

3. Determine the service utility: the TCoE will act as a source of expertise for the whole organization.

4. Last but not least, the quality-innovation leads provide everything needed for a centralized testing environment, including resources, tools, management, and governance of all applications and business processes.


Conclusions:

This unique business-driven testing approach will enable clients to achieve the highest-quality application deployments, with less cost, less risk, and a faster time to market. And so goes the phrase, “When testing gets better, business runs smoother.”

Tell us what you think about this blog on feedback@bistasolutions.com. Feel free to get in touch with us through sales@bistasolutions.com for more insights.

Top Features of Power BI

As a complete self-service BI tool, Power BI stands out with its unique array of features that facilitate developing truly interactive BI dashboards. One can build interactive dashboards from varied data sources in minutes. The dashboards are accessible in the app-based Power BI service platform, where the user can view them, drill down, apply report filters, and even download them. The Power BI suite comes with some unique features in the BI analytics space.

Power BI suite

The Power BI suite encapsulates the features of Power Pivot, Power Query, and Power Map to provide a comprehensive solution for BI reporting and analytics. One can easily build a pivot table summarizing the data. With Power Query, one can combine and refine data across a wide variety of sources, including traditional relational data, structured and semi-structured data, the Web, Hadoop, Azure Marketplace, and others. Power Query also allows one to search for public data from sources such as Wikipedia. With Power Map, one can easily visualize data split by geography. Power BI also integrates quickly with Microsoft Excel through Excel add-ins, so a dashboard report made in Excel can be published in one click.

Importing data from multiple sources

Data can be imported from sources ranging from conventional relational databases to varied data-source platforms: Facebook, SharePoint Online lists, Salesforce, Google Analytics, Microsoft Azure data warehouse, Hadoop Distributed File System (HDFS), and much more.

One can instantly create a website-usage dashboard by connecting to Google Analytics, provided in the Online Services section of Power BI.

[Screenshot: importing data]

Visualizations

Combo charts, treemaps, fill maps, gauges and funnel charts provide customers with more ways to view their data in Power BI.

Power View enables ad hoc, self-service data visualization, while Power Map lets users view data containing almost any geographical attribute in a 3D-rendered Bing Maps environment.

[Screenshot: Power BI visualizations]

Natural-language search technology

Natural-language search technology helps users ask questions of their data by typing into a dialog box. The system understands the question and provides answers in the form of interactive tables, charts, and graphs.

With natural-language search, one can simply search for something and get macro-level insights, represented by interactive dashboards, within minutes. For example, to get counts split across geographies, one can search for things like “Olympic medals by country”, “unemployment rate (US)”, or “housing prices by city”.

Power BI Advantages & Features

To summarize, below are some of the unique features of Power BI:

  • Reports are deployable on multiple platforms: web, mobile apps, and tablets. You can use the device of your choice without worrying about the underlying database platform.

  • Natural-language search is a unique feature that helps create reports with data from online services like Bing Maps and Google Analytics.

  • Power BI can independently handle almost all the data extraction and summarization functions that ANSI-SQL scripting and Excel provide, so there is no need for SQL scripts to extract and summarize data.

  • Visualizations facilitate drill-down and drill-up, and data can even be exported according to the hierarchy.

  • Easy integration with Excel data sources, including charts, pivot tables, and pivot charts. An Excel dashboard can be pinned and published to the Power BI service.

  • Text search gives answers about the data in the Power BI service: Power BI identifies objects from the keywords typed in and provides the output as a visualization.

  • Calculated columns and measures can be incorporated in the loaded dataset independently of the data source. Parameterized queries and filters can also be applied at the dataset level without affecting the data source.

  • A separate relationships interface with auto-detection of relationships and IntelliSense.

  • DAX functions are supported, and the UI intelligently suggests table and field names as you type.

  • Facilitates interactive visuals. One can also import visuals from the visuals gallery, a free online collection, and quickly incorporate them into a report.

  • Lastly, pricing is per user per month on a pay-as-you-go basis, which is very aggressive. For more details, see Power BI pricing.

  • Power BI users also have access to an online community where they can ask questions and raise tickets, and the official website has a blog and great resources for self-learning.

Contact us at feedback@bistasolutions.com for a free evaluation of how big data can be leveraged to provide you a competitive advantage.

Supply Chain Optimization using Big Data

As we witness a pivotal change in the way big data is revolutionizing and redefining all aspects of our lives, it becomes increasingly necessary for professionals from all domains to think radically about its application in their industries. The inventions around the Hadoop ecosystem have enabled ground-breaking technologies, from driverless cars to intelligent assistants like Siri. It is not surprising that the crucially important field of supply chain optimization is ripe for a major breakthrough in how it has been approached until today.

Traditionally, procurement has been planned around either predefined reorder points triggering a procurement request, or a fixed forecasting period using safety stock and an average sales forecast. The problem with this approach is that there is no feedback loop to react in real time as business scenarios change. It led either to “lost opportunity”, from not having the right inventory or the right price, or to “dead stock”, from wrong stocking or purchasing decisions.

[Figure: supply chain process challenges]

This problem of not being agile and responsive to events occurring in the marketplace can be addressed by using big data technologies. The process starts by dividing the various steps involved in supply chain automation into multiple operational windows. This facilitates the prioritization of decisions based on how frequently they need to be evaluated. The results of each phase feed into the decisions of the next, creating a positive feedback loop which makes the entire process more responsive to external events.

The process starts with strategic planning, which involves a high-level analytics process in Hadoop to baseline the data. Here we automatically calculate the various parameters that impact the supply chain decision process. This is generally an iterative process, run on a quarterly or monthly schedule depending on the type of business. The metrics from the previous period feed into this process, and the performance of the various parameters is evaluated and tweaked accordingly.

The next phase involves tactical decision making, where various decisions regarding procurement and transfers are made based on the parameters and the demand forecast. In this phase, decisions on what to buy, when, and from which vendor are made. Decisions on how to stock a multi-echelon distribution network are also made in this step.

After this step, the next phase involves continuous evaluation of the performance of the supply chain and making tweaks to inventory placement, selling prices, and so on. These techniques of near real-time decision making are also referred to as “demand sensing”.

[Figure: optimization cycle]

The Details

Strategic Planning:

  • Inventory categorization: Various methodologies for categorizing inventory are used, including FNS classification, order-frequency analysis, and price-sensitivity analysis.
  • Multivariate clustering: The various parameters that influence demand are automatically evaluated by creating clusters, using techniques like Principal Component Analysis and other clustering models.
  • Determining the best-fit algorithm: Each item in the inventory has a different demand pattern; it could have a trend, seasonality, and so on. The model that best forecasts demand varies for items in different clusters. The best model is identified and stored for forecasting.
  • Multi-echelon network calculation: If the company has multiple warehouses forming a distribution network, we need to determine the best roll-up and aggregation strategy for each item in the network.
  • Supply chain parameters: The various parameters that influence procurement and transfers are calculated based on the demand pattern and historical receiving performance.

Tactical Planning:

  • Demand forecast: Demand is forecast for the various items in the inventory for the selected period, using the best-fit algorithm and clusters determined in the strategic planning process.
  • Procurement plan: The projected demand and the forecast inventory position in the period are used to calculate the procurement plan. The historical performance of the vendor determines the order dates and quantities, and the EOQ, safety stock, and other inventory parameters are used to build the procurement plan for the period (see the sketch after this list).
  • Inventory transfers: For a distribution network, the stock placements at the various locations are calculated and the transfers are created.
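As a rough illustration of how these inventory parameters might be computed, here is a minimal Python sketch using the textbook EOQ and safety-stock formulas; all the demand, cost, and lead-time figures are invented for the example, and a real system would calibrate them from the data described above:

```python
import math

def eoq(annual_demand, order_cost, holding_cost_per_unit):
    """Economic Order Quantity: sqrt(2 * D * S / H)."""
    return math.sqrt(2 * annual_demand * order_cost / holding_cost_per_unit)

def safety_stock(z, demand_std_per_day, lead_time_days):
    """Safety stock for a target service level (z = z-score of that level)."""
    return z * demand_std_per_day * math.sqrt(lead_time_days)

# Hypothetical item: 12,000 units/year, $50 per order, $2 holding cost/unit/year.
q = eoq(annual_demand=12000, order_cost=50, holding_cost_per_unit=2)
# 95% service level (z ~ 1.65), daily demand std of 40 units, 7-day lead time.
ss = safety_stock(z=1.65, demand_std_per_day=40, lead_time_days=7)
print(f"Order quantity: {q:.0f} units, safety stock: {ss:.0f} units")
```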

Demand Sensing:

  • The most crucial aspect of the big data architecture is the ability to respond to changes in actual sales and adapt the strategy accordingly.
  • “Lost sales” can be tracked and compared against forecast sales to detect an under- or over-demand scenario. If demand exceeds the forecast, purchase orders can be expedited to meet the unexpected demand. This can also lead to decisions to transfer inventory internally across locations (inventory levelling).
  • The price sensitivity determined during the strategic planning phase can be used to lift lagging sales; promotions can be run to boost sales to the expected values.
  • Some variations in the supply chain, such as shipment delays by vendors, can be handled by either inventory levelling or expediting other POs on order.
  • Text analytics can be used to forewarn of potential disruptions to the supply chain, so that precautionary steps can be taken to avoid any impact on inventory.

[Figure: automation workflow]

Conclusion:

The new age of data science and big data technology opens new vistas for automating the hitherto manual process of supply chain optimization. Technologies like Hadoop enable working with millions of SKUs and several years of historical data comprising billions of transactions. The integration of machine learning libraries in tools like Spark has brought predictive analytics into the mainstream. The latest Lambda and Kappa architectures enable stream processing of near real-time data and the creation of predictive models that respond to changes in business patterns. This process can greatly improve the performance of the supply chain and thus the overall business.

Contact us at feedback@bistasolutions.com for a free evaluation of how big data can be leveraged to provide you a competitive advantage.

How to Launch your own Magento 2 store

Launch your own Magento Store

Download the latest Magento 2 from the Magento eCommerce website.

Please refer to the link: www.magento.com/download

Prerequisites for Magento 2 are as follows:

  • Apache 2.2 or 2.4
  • PHP 7.0.2, 5.6.x or 5.5.x (PHP 5.4 is not supported)
  • MySQL 5.6.x

We assume you have Apache, PHP, and MySQL ready, matching the prerequisites above for eCommerce Magento shopping sites.

Let’s Start:

STEP 1: Extract the Magento 2 archive for your store and make it accessible through the web server. We recommend putting it in the html folder for Apache 2.4, or you can create a virtual host and make Magento 2 accessible from anywhere.

STEP 2: Open the Magento 2 URL in your web browser and follow the step-by-step installation instructions displayed there. You will then have successfully installed the store and the admin panel of your Magento website.

WE RECOMMEND: Magento 2 checks for a few PHP and Apache libraries during installation. Make sure you have them installed beforehand, or you will not be able to proceed further with your installation.

Magento 2 setup will guide you through.

STEP 3: At the end of the installation, Magento 2 shows a unique admin URL. Make sure you remember it or note it down before you proceed, along with the admin username and password.

Installing Magento 2 is one thing, but setting up a store for your products is entirely different. There are several things you need before you launch your online store: products, categories, store URL, secure URL, email configuration, contact details, payment gateway, and shipping method details. Magento is one of the best eCommerce platforms.

You can send us your comments on feedback@bistasolutions.com

Automation testing – Myths and Realities

It is always very important to analyse what purpose a particular technology serves before adopting it in your organisation. Even though automated software testing has several known advantages, such as higher productivity, faster regression, quick feedback to development teams, and increased ROI, not every organisation can adopt automated software testing and replace manual testing. A lot of testers hold the superstitious belief that automation testing is better than manual testing and can replace it; however, this is true only in a few circumstances. A testing team should be aware of the myths and realities of automation testing before jumping to accept it. Here are a few myths about automation testing, followed by their realities.

#Automated Software Testing is Fast! – Myth

#Well, Automation Testing does consume time! – Reality

Automated software testing can help the organization in a big way when used in the right way and with the right set of expectations. But for this to be possible, we have to put in time, money, and, most importantly, patience. Testers need to understand the domain and the test cases to be automated, and then choose a framework accordingly to build automated scripts. This results in a strong foundation for the challenges to come.

The effort put into automated software testing is comparable to the effort put into developing an application that needs thorough validation. Automation test scripts must be scrutinized properly, keeping every possible set of test data under consideration, including negative testing. Failing to do so and handing over a partly tested suite leads to failure of the automated scripts during execution, as a result of which you tend to lose confidence in the tool.

#Automated Software Testing is a Replacement for Manual Testing! – Myth

#Automation Testing does prove to be better than Manual Testing, but not always! – Reality

Just as robots cannot replace humans on earth, automated testing will never be able to replace manual testing capabilities completely; it is unrealistic to believe otherwise. A project will always need a human brain to analyze the test results of applications that are unstable and change frequently. In such cases, automation testing is used only as a reference, not a replacement. Automation testing is best suited to applications that are static, independent of other modules, and need checking during regression testing, or to applications whose development is complete.

#Automated Software Testing has Quick ROI! – Myth

#Automation Testing’s ROI is a long-term return! – Reality

While implementing automated testing solutions, apart from just writing the test scripts, a few interrelated software development tasks are involved. First of all, a framework that can support the testing operations has to be developed, which is a huge task in itself and requires highly skilled people. Even if a team decides to use a fully developed framework, the initial test-case runs will take more time than executing the tests manually. So if an application is still in the development stage and requires quick feedback, test automation is not the right choice. The ROI of automation testing is, therefore, a long-term return.

#Automated Software Testing holds good for any Test Case Scenario! – Myth

#Automated testing at the GUI layer is always a critical problem! – Reality

Automation testing works considerably well for checking process flows, user experience with the application, or integration with third-party applications. But using automation testing to check the functionality of a system’s GUI is a setback. The GUI of a system undergoes frequent changes in design and usability even though the functionality of the UI remains the same, and this is why test automation of a UI constantly fails. Automation testing applied to a UI is also slower, and so is the feedback it gives to developers.

#Expecting 100% Automation without any Failure! – Myth

#Executing Automated Software Testing without a failure is practically Impossible! – Reality

There are several reasons why test scripts can fail during execution. Whether due to data variation, environment issues (an environment being down), network failures, or changes in the UI, failure of test cases cannot be ruled out.

Conclusions:

Automation testing is undeniably a prime strategy for any testing team, yet not all organisations sail through in adopting it. This can be addressed by taking care of the following points:

  • Before adopting test automation, do some homework to understand the application to be automated; this will help in setting the right deadlines and expectations.

  • Discuss and decide with the team which key areas need to be automated.

  • Automation testing is only for stable, fully developed applications, not for applications that keep changing from time to time.

  • Do not be afraid of tests that constantly give wrong results; instead, keep faith and aim for a clean and reliable test suite.

Please feel free to write to us at feedback@bistasolutions.com with your feedback on how this blog helped you understand the realities of automation testing.

5 Statistical Methods For Forecasting Quantitative Time Series

Time Series Algorithms

Time is one of the most important factors on which our businesses and real-life depend. But, technology has helped us manage time with continuous innovations taking place in all aspects of our lives. Don’t worry, we are not talking about anything which doesn’t exist. Let’s be realistic here!

Here, we are talking about techniques for predicting and forecasting future strategies. The data we generally deal with is time-based, “time series data”, and the model we build on it is a “time series model”. As the name indicates, it works on time-based data (years, days, hours, and minutes) to explore hidden insights in the data and to try to understand the unpredictable nature of the market, which we have been attempting to quantify.

TIME SERIES:  

Time series data provides visual information on the unpredictable nature of the market that we have been attempting to quantify and get a grip on.

A time series is an ordered sequence of observations of a variable or captured object at equally spaced time intervals: anything observed sequentially over time at regular intervals, such as hourly, daily, weekly, monthly, or quarterly. Time series data matters when you are predicting something that changes over time using past data. In time series analysis, the goal is to estimate future values using the behavior of past data.

There are many statistical techniques available for time series forecasting; however, we have found a few effective ones, listed below:

Techniques of Forecasting:

    • Simple Moving Average (SMA)
    • Simple Exponential Smoothing (SES)
    • Autoregressive Integrated Moving Average (ARIMA)
    • Neural Network (NN)
    • Croston

METHOD-I: SIMPLE MOVING AVERAGE (SMA)

Introduction:

A simple moving average (SMA) is the simplest forecasting technique. It is calculated by adding up the last ‘n’ periods’ values and then dividing that number by ‘n’. The moving average value is then used as the forecast for the next period.
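As a minimal illustration of this calculation, here is a Python sketch; the demand series and the window size n are made up for the example:

```python
# Simple moving average forecast: the mean of the last n observations
# becomes the forecast for the next period.
def sma_forecast(values, n):
    if len(values) < n:
        raise ValueError("need at least n observations")
    return sum(values[-n:]) / n

# Hypothetical monthly demand figures.
demand = [112, 118, 132, 129, 121, 135, 148, 148, 136, 119, 104, 118]
print(sma_forecast(demand, n=3))  # average of the last 3 periods
```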

Why Do We Use SMA?

Moving averages can be used to quickly identify whether sales are moving in an uptrend or a downtrend, depending on the pattern captured by the moving average.

That is, a moving average smooths out irregularities (peaks and valleys) so that trends are easier to recognize.

SMA Working Example:

To better understand SMA, suppose we have time series data consisting of twelve price observations at equal time intervals. Plotting the data shows an upward trend with a lot of peaks and valleys.

Conclusion: The larger the interval, the more the peaks and valleys are smoothed out; the smaller the interval, the closer the moving averages are to the actual data points. SMA deals well with historical data that has many peaks and valleys, such as stock or retail data.

METHOD-II: EXPONENTIAL SMOOTHING

Introduction:

This is the second well-known method to produce a smoothed Time Series. Exponential Smoothing assigns exponentially decreasing weights as the observation gets older.

Why Do We Use Exponential Smoothing?

Exponential smoothing is a way of “smoothing” out the data by removing much of the “noise” (random effects), thereby giving a better forecast.

Types of Exponential Smoothing Methods

  • Simple Exponential Smoothing:

If you have a time series that can be described using an additive model with a constant level and no seasonality, you can use simple exponential smoothing to make short-term forecasts.

  • Holt’s Exponential Smoothing:

If you have a time series that can be described using an additive model with an increasing or decreasing trend and no seasonality, you can use Holt’s exponential smoothing to make short-term forecasts.

  • Winters’ Three-Parameter Linear and Seasonal Exponential Smoothing:

If you have a time series that can be described using an additive model with increasing or decreasing trend and seasonality, you can use Holt-Winters exponential smoothing to make short-term forecasts.

Graphical Views:

Exponential Smoothing:

Here, alpha is the smoothing constant. The simple exponential smoothing method treats the other two factors (trend and seasonality) as constant. Holt’s (double) exponential smoothing and Winters’ exponential smoothing handle those two factors, trend and seasonality, via the beta and gamma parameters.
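A minimal Python sketch of simple exponential smoothing, using the standard recurrence level = alpha * actual + (1 - alpha) * previous level; the series and the alpha value are made up for illustration:

```python
# Simple exponential smoothing: each smoothed value blends the newest
# observation with the previous smoothed value, so older observations
# receive exponentially decreasing weights.
def ses_forecast(values, alpha):
    level = values[0]                  # initialize with the first observation
    for y in values[1:]:
        level = alpha * y + (1 - alpha) * level
    return level                       # one-step-ahead forecast

demand = [112, 118, 132, 129, 121, 135, 148, 148, 136, 119, 104, 118]
print(ses_forecast(demand, alpha=0.3))
```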

Conclusion: The larger the alpha, the closer the smoothed series is to the actual data points, and vice versa. This method is suitable for forecasting data with no trend or seasonal pattern (alpha = smoothing constant).

METHOD-III: AUTOREGRESSIVE INTEGRATED MOVING AVERAGE (ARIMA)

Autoregressive Integrated Moving Average (ARIMA):

ARIMA is a statistical technique that uses time series data to predict the future. Its parameters (p, d, q) refer to the autoregressive, integrated, and moving average parts of the model, respectively. ARIMA modeling takes care of trends, seasonality, cycles, errors, and the non-stationary aspects of a data set when making forecasts.

Understanding the ARIMA Model in General Terms:

How to Understand ARIMA model?

To understand this, consider a real-world analogy: a sugar cane juicer. It is difficult to extract all the juice in one pass, so the shopkeeper repeats the process several times until there is no more juice left in the residue. That is how ARIMA works: the idea behind ARIMA models is that the final residual should look like white noise; otherwise, there is still juice (information) left in the data to extract.

How Do We Use the ARIMA Model?

ARIMA requires stationarity in the data; the data should also show constant variance in its fluctuations over time. The proper values of the ARIMA parameters are found through the “identification process” proposed by Box and Jenkins.

When Do We Use the ARIMA Model?

As we all know, ARIMA is mainly used to project future values using historical time series data. Its main application is short-term forecasting, with a minimum of 38-40 historical data points and a minimal number of outliers. If you do not have at least 38 data points, it is advisable to look for other methods.

Working Example of ARIMA

Here, we try to understand ARIMA using quarterly European retail trade data from 1996 to 2011. The data are non-stationary, with some seasonality, so we first take a seasonal difference. The seasonally differenced data still appear non-stationary, so we take an additional first difference, and further differences if required.

We then fit a seasonal ARIMA model, check its basic requirements, and use it for forecasting. Forecasting the next three years, the forecasts follow the recent trend in the data (this occurs because of the double differencing).
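As a sketch of how such a model might be fitted in Python with the statsmodels library: the quarterly series below is a made-up stand-in for the retail data, and the (p, d, q) orders are illustrative rather than the ones a full Box-Jenkins identification would produce:

```python
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

# Made-up quarterly series standing in for the European retail trade data.
data = pd.Series(
    [89.1, 90.3, 92.0, 94.2, 93.5, 95.1, 96.8, 99.0,
     98.2, 99.7, 101.3, 103.6, 102.9, 104.2, 105.8, 108.1,
     107.0, 108.8, 110.4, 112.9],
    index=pd.period_range("1996Q1", periods=20, freq="Q"),
)

# Seasonal ARIMA with one regular and one seasonal difference, mirroring
# the double differencing described above (period 4 = quarterly).
model = SARIMAX(data, order=(0, 1, 1), seasonal_order=(0, 1, 1, 4))
fitted = model.fit(disp=False)
print(fitted.forecast(steps=12))  # the next three years (12 quarters)
```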

Conclusion: ARIMA works best when your data exhibit a stable, consistent pattern over time with a minimal number of outliers.

METHOD-IV: NEURAL NETWORK

Introduction:

ANN: An artificial neural network (ANN) is a machine learning approach that models the human brain and consists of many artificial neurons. Their ability to learn by example makes them very flexible and powerful.

Why Do We Use Neural Networks?

Neural networks can derive meaning from complicated or imprecise data and can detect patterns and trends that are not easily detectable by the human eye or by conventional computing techniques. Neural networks also offer advantages such as adaptive learning, self-organization, real-time operation, and fault tolerance.
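As a minimal sketch of this “learning by example” idea applied to sales forecasting, here is a small scikit-learn multilayer perceptron; the series, the lag window, and the network size are all made up for illustration:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Made-up monthly sales series; real data would come from the business.
sales = np.array([112, 118, 132, 129, 121, 135, 148, 148, 136, 119, 104, 118,
                  115, 126, 141, 135, 125, 149, 170, 170, 158, 133, 114, 140])

# Supervised examples: the previous three months predict the next month.
X = np.array([sales[i:i + 3] for i in range(len(sales) - 3)])
y = sales[3:]

model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000, random_state=0)
model.fit(X, y)
print(model.predict(sales[-3:].reshape(1, -1)))  # forecast the next month
```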

Applications of neural networks

Nowadays, neural networks are equally important in every field; some examples are listed below:

  • Sales Forecasting

  • Industrial Process Control

  • Customer Research

  • Data Validation

  • Risk Management

  • Target Marketing

Conclusion:

We can use neural networks in any type of industry and benefit from them, as they are very flexible and do not require an explicitly programmed algorithm. They are also regularly used to model parts of living organisms and to investigate the internal mechanisms of the brain.

METHOD-V: CROSTON

Introduction:

This modification of exponential smoothing for products with sporadic demand was suggested by Croston in 1972. The core value of the method is that it estimates not only the average demand volume but also the length of the time interval between two non-zero demands, a pattern called intermittent demand.

The Croston method works in two steps. First, separate exponential smoothing estimates are made of the average size of demand. Second, the average interval between demands is calculated. These are then combined in a constant model to predict future demand.

How Does Croston’s Method Work?

Croston’s method has a complex formula, but the output is very simple. In the comparison referenced here, a 12-month average and a 5-month average are each compared against Croston’s forecast.

As you can see, Croston removes the periods that have no demand, averaging only the periods that do have demand. Croston then calculates the frequency of the demand. The math behind this is complex, but the output is very similar to performing exponential smoothing.
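Here is a minimal Python sketch of the two-step procedure described above; the smoothing constant and the demand history are made up, and real implementations differ in initialization details:

```python
# Croston's method: smooth the non-zero demand sizes and the intervals
# between them separately, then forecast their ratio per period.
def croston_forecast(demand, alpha=0.1):
    size_est = None       # smoothed size of non-zero demands
    interval_est = None   # smoothed interval between non-zero demands
    periods_since = 1
    for d in demand:
        if d > 0:
            if size_est is None:          # initialize on the first demand
                size_est, interval_est = float(d), float(periods_since)
            else:
                size_est += alpha * (d - size_est)
                interval_est += alpha * (periods_since - interval_est)
            periods_since = 1
        else:
            periods_since += 1
    return size_est / interval_est        # expected demand per period

# Intermittent series: mostly zeros with occasional demand.
history = [0, 0, 5, 0, 0, 0, 7, 0, 0, 6, 0, 0, 0, 8]
print(croston_forecast(history))
```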

Why Do We Use CROSTON?

Consider two Croston forecasts based on demand histories with relatively few non-zero data points. This is where Croston’s method comes into the picture and shows its benefits.

  • First, Croston detects cycles and periodicity in the demand pattern. In this example, it suggests that demand could occur after roughly 3.5 (4, after rounding up) zero periods.

  • Second, if the recent periods are zero periods, Croston adjusts the predicted next occurrence from the last non-zero period.

The objective of the forecast is to predict consumption at the right moment and in the right quantity. Croston tries to predict the “right moment”, which is more sophisticated than a moving average.

Conclusion:

The Croston method is a forecasting strategy for products with intermittent demand; it is chosen as the forecast strategy in a univariate forecast profile.

Croston’s can be closely emulated with exponential smoothing, and any timing benefit is usually absorbed by order lot sizing and/or safety stock in supply planning. Demand history must therefore not only be lumpy but also very low for Croston’s to be of value, so it can be seen as a specialty forecasting method that provides value in certain limited circumstances.

For more information on statistical methods for forecasting or any such implementation, you can simply reach out to us at sales@bistasolutions.com. If you’d like to implement software with forecasting tools for your business, get in touch using our contact form.

Data Selection, Gathering and Preparation for Demand Forecast


Usually, the database from which reports are fetched contains a mix of data: data used by the software itself, configuration values, transactional data, and so on. Selecting the right kind of data and gathering it together to give a relevant output, on which the next step (i.e. FNS segmentation) can be applied, plays an equally important role in better demand forecasting.

Data preparation:

Continuing our previous example, suppose that for demand forecasting for mint candies we chose all the data available in the backend. In that case, fields like the name of the salesman who sold the candies and the vehicle in which they were shipped are extra information that is of no use in forecasting candy sales. Also, syncing a large chunk of data every time results in performance and slowness issues when fetching data for the report. So the first thing to take care of is selecting exactly the data that is useful for processing reports.

The next piece of feedback came from one of our existing clients, with a common business scenario. In our example, the company already has three existing products: mint candies, bar chocolates, and luxury dark chocolates. Mid-way through the financial year, a new product is launched, say jelly beans, and by applying the same pattern and FNS segmentation, jelly beans also start showing progress in sales. But at the end of the financial year, when we evaluate all our products, the new product, even after selling well, will show low sales in the year-end report. The reason is its introduction mid-year compared to the existing products. So the next thing to take care of is tracking each product from the date it was introduced in the market or the warehouse.

[Figure: data preparation]

Seasonality and Trend:

Moving further in the analysis, we learned that products have to be tracked season-wise, such as during festivals versus regular days. Also, based on customers’ tastes, certain products do well in one part of the country while not doing well in another, so products have to be tracked geographically as well. There are many levels of tracking: customer level, market level, shop level, and at times hub level. And if we expand our example horizontally across an industry with chocolates, soaps, chips, and so on, it becomes necessary to track products by category as well, so that the performance of each line of business can be tracked.

Segmentation:

Last but not least, there is a trend that most of the industry is now adopting: segmentation of data. This means dividing customers into groups based on their buying patterns and lifestyle, and then taking business steps focused on a particular group of customers. A simple example: a group of customers who buy luxury dark chocolates frequently can be treated as a platinum group, and to keep them engaged with the product, additional discounts or value-added services can be offered to them. So, to maintain or improve product sales, we need to take care of data segmentation as well.
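As a minimal sketch of this idea, customers can be grouped by buying pattern with a standard clustering algorithm such as k-means; the features and the cluster count here are made up for illustration:

```python
import numpy as np
from sklearn.cluster import KMeans

# Hypothetical per-customer features: [orders per month, average order value].
customers = np.array([
    [1, 5], [2, 6], [1, 4],      # occasional, low-value buyers
    [8, 40], [9, 45], [7, 38],   # frequent, high-value ("platinum") buyers
    [4, 15], [5, 18], [4, 12],   # mid-tier buyers
])

kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(customers)
print(kmeans.labels_)            # segment assignment for each customer
```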

We hope our experience helps in some way in optimizing or directing your business. As always, we conclude with: if you like any of our advice or suggestions, or if you are looking forward to any such implementation, you can mail us at sales@bistasolutions.com or contact us here.

How Principal Component Analysis can reduce complexity in demand forecast when you have too many predictors

Organizations are facing challenges in managing their margins and keeping up with industry growth. Predictive analytics has helped organizations stay ahead of the competition and bring value to their customers. Many organizations have used predictive analytics across departments, helping them increase market share, cut costs, and retain customers while maintaining healthy margins.

One of the most challenging fields in predictive analytics is demand forecasting, or demand planning. What is the demand for my product in the market, and how much inventory do I need to keep in stock to avoid over- or under-stocking? These are two critical questions organizations must answer today.

The key factor when forecasting demand is to list the variables that are going to impact the forecast. There has been great demand for macroeconomic forecasts that use many predictors to produce accurate results. Ignoring relevant variables influences forecasting accuracy and may result in suboptimal forecasts. Therefore, statisticians have been developing effective ways to utilize the information available among these predictors to improve the performance of forecasts.

Principal component analysis is one of the methods that identify a smaller number of uncorrelated variables, called “principal components”, from a large data set. The objective of principal component analysis is simply to obtain a relatively small number of factors that account for most of the variation in a large number of observed variables.
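As a minimal sketch of this reduction with scikit-learn: the survey scores below are randomly generated stand-ins for real customer responses, and the 80% variance threshold is an illustrative choice:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Stand-in customer-response scores on 8 product characteristics
# (e.g. shape, size, texture, color, packaging, smell, taste, price).
rng = np.random.default_rng(0)
responses = rng.normal(size=(100, 8))

# Standardize, then keep enough components to explain ~80% of the variance.
scaled = StandardScaler().fit_transform(responses)
pca = PCA(n_components=0.80).fit(scaled)

print(pca.n_components_)               # number of components retained
print(pca.explained_variance_ratio_)   # variance explained by each
reduced = pca.transform(scaled)        # inputs for the demand-forecast model
```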

Let’s look at an example –

Say we want to analyze customer responses to several characteristics of four types of candies (dark, caramel, mint, bar): shape, size, texture, color, packaging, smell, taste, and price. This step is known as product classification (refer to picture a).

[Picture a: product classification]

We need to determine a smaller number of uncorrelated variables which will help in reducing the complexity while forecasting demand. Principal components analysis will allow us to do that. The results yield the following patterns (refer to picture b):

  • Taste, smell, and texture form a “Candy quality” component.
  • Packaging and shape form a “Desirability” component.
  • Size and price form a “Value” component.

[Picture b: principal components]

This way, we can reduce the number of variables and use these three as inputs for a demand forecast analysis that determines how many candies we will sell in a particular month or quarter, based on historical data. Want to know more detail? Contact us today!

7 ways Big Data can dramatically change Supply Chain Analytics

Managing an efficient supply chain has always been a balancing act between maintaining high service levels and a healthy inventory turnover ratio. Numerous studies and research projects have been conducted over the years to address the critical issues facing supply chain practitioners. Many software applications and packages have also been custom-built to ensure that “lost sales” or “stock-outs” do not become a sore point in sales review meetings, though mostly at the expense of low inventory turns and overstocking of parts.

The latest developments in big data technology, which is sweeping across many industries and bringing huge competitive advantages, can be applied just as reliably to the challenges faced by supply chain professionals. Big data gives the industry unprecedented power by bridging structured and unstructured data and putting information at the practitioner’s fingertips for quick decision-making and insight. The following are some major game-changing capabilities big data can bring to the practice of supply chain analytics.

1. Leveraging large volumes of data: A lot of companies have large volumes of historical data running into multiple years, or in some instances even decades. Hadoop’s distributed storage architecture, along with efficient storage formats like Parquet, Avro, and ORC, enables compact storage with very fast access. Thus the huge volume of data, hitherto not leveraged to its fullest extent, can now be used effectively for advanced analytics.

2. Blending unstructured data for deep intelligence: The availability of NoSQL databases like HBase and Cassandra in the big data landscape enables analytics on unstructured text data, which has not been possible until now with legacy analytics and forecasting packages. This means that information from XML product catalogs or supplier web services can be integrated into the supply chain decision-making process.

3. Advanced analytical models: The big data community has developed very advanced machine learning algorithms which can be leveraged in analytical models for demand forecasting and procurement planning. Tools like Spark, with its machine learning library (MLlib) and R integration through SparkR, allow very advanced models to be run on time series and other data for accurate forecasting and prediction.
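As a minimal sketch of this idea (the session name, data, and column names are all hypothetical), one can build lag features over a sales history with Spark SQL and fit an MLlib regression model on them:

```python
from pyspark.sql import SparkSession, functions as F, Window
from pyspark.ml.feature import VectorAssembler
from pyspark.ml.regression import LinearRegression

spark = SparkSession.builder.appName("demand-forecast-sketch").getOrCreate()

# Hypothetical sales history: (item, period, units_sold).
sales = spark.createDataFrame(
    [("sku1", i, float(100 + 10 * (i % 4) + i)) for i in range(1, 25)],
    ["item", "period", "units_sold"],
)

# Lag features: demand in each of the previous three periods, per item.
w = Window.partitionBy("item").orderBy("period")
for k in (1, 2, 3):
    sales = sales.withColumn(f"lag{k}", F.lag("units_sold", k).over(w))
train = sales.dropna()

assembler = VectorAssembler(inputCols=["lag1", "lag2", "lag3"], outputCol="features")
model = LinearRegression(labelCol="units_sold").fit(assembler.transform(train))
print(model.coefficients)  # weights learned for each lag
```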

4. Text analytics: In addition to structured data stored in systems like Hive and semi-structured data stored in HBase, there are numerous tools in the big data toolbox, like Elasticsearch and Apache Solr, which open the door to analyzing text data across systems. This enormous amount of textual data can be used to gather additional insight about product feedback, quality, and other metrics, which can feed into supply chain planning for further improvements.

5. External data source blending: External data can add a lot of value to demand forecasting or lead-time prediction by leveraging real-time information. Advances in big data technologies enable supply chain software to respond dynamically to our ever-changing world. Hadoop has been used successfully as an ETL tool to unify such disparate data, and data from external systems can be used to identify potential new suppliers with better lead times and prices.

6. Agility in response: Some of the big data components, like Oozie, Sqoop, Flume, Kafka, and Storm, bring the capability of doing procurement in real time rather than periodically. These features make the company’s supply chain more agile in responding to a spike in demand, a delay in shipment, or a sudden requirement for one of the components in a multi-echelon network.

7. Automated decisions: Gone are the days when supply chain professionals would pore over information in multiple spreadsheets to make procurement decisions. Deep learning systems based on neural networks can now take automated actions based on previously learned data. Moreover, these algorithms get smarter over time by comparing their responses against actual results. If you wish to know more, get in touch with our team.