Odoo 11 & Its Modules

  • by bista-admin
  • Dec 29, 2017

Odoo 11 was released in 2017; at the time of this update, the latest version is Odoo 16. In this post we highlight some of the modules that were improved in Odoo 11. Note that Odoo 11 is supported on Python 3 only.

Speed:

Odoo 11 loads much faster, with loading speeds 2-3 times quicker than Odoo 10.

Menus:

Sales and CRM have separate menus in Odoo 11.

The Website and Website Admin menus from Odoo 10 have been merged into a single Website menu.

Kanban View:

A new Kanban layout with a useful progress bar, for example in CRM -> Pipeline.

New App:

Online Appointments:

Allows customers or visitors to book a meeting, with reminders sent through email and SMS.

Marketing Automation:

Automates repetitive actions such as email campaigns. For example, a campaign can email targeted customers whenever a new activity occurs, keeping them up to date, and can also set automated reminders for any activities.

Accounting Dashboard:

A new configuration bar is introduced to set up and navigate Company Data, Bank Accounts, Fiscal Year, Chart of Accounts, and Initial Balance.

POS:

The Pricelist feature is included.

Sales:

Proforma Invoices – Print Proforma invoices directly from the sale order.

Product Catalog – a new menu added to manage products, pricelists, and variants.

Improved Configuration -> Settings:

Global search across all settings, plus the ability to switch from the settings of one app to another.

Website:

Improved custom URLs in the website builder: in Odoo 10, pages created from the website builder included “/page/” in the URL by default; this prefix is removed in Odoo 11.

Website New Features:

Wishlist – Customers can add products to their wishlist so that they can buy their favorite products faster.

Product Comparator – allows customers to compare products.

Product Availability – shows the quantity of product available, and allows products that are not available to be made inactive.

Coupons and Promotions – Added coupons and promotions system.

Inventory:

Unlock button in Delivery Orders – allows you to update product quantities even after the order has been validated.

Picking Type – a new picking type, mrp_operation (Manufacturing Operation), is added alongside internal, incoming, and outgoing. Accordingly, the Inventory dashboard shows a new Manufacturing Kanban card containing manufacturing orders.

Keyboard Shortcut:

More keyboard shortcuts have been added.

CRM:

Clicking the Mark Won button in the Pipeline shows a rainbow with a smiley face to celebrate the performance.

Project:

Sale and Invoice – Create projects or tasks on the validation of sales orders.

Subtask – subtasks can be created by selecting the Parent Task, in both Community and Enterprise.

Task Merge – duplicate tasks can be merged.

Timesheets:

Leaves – Leaves can be automatically recorded in the timesheet.

Reminders – automated emails remind employees and managers to fill in their timesheets.

Uninstall App:

When uninstalling any app, a wizard shows up in which you can select the modules you want to uninstall; their related records are then deleted.

New payment acquirers:

PayUmoney and Stripe.

Bista Solutions is a longtime Odoo partner and implementation provider. If you have any queries regarding the above highlights of Odoo 11, or are considering an Odoo 11 implementation or a migration of an existing Odoo instance, feel free to reach out to us at sales@bistasolutions.com.

How Can Cloud ERP Improve Your Company’s Day-To-Day Business Performance?

Benefits of Cloud Based ERP

Cloud based ERP solutions play a crucial role in a company’s day-to-day operations once implemented. Besides reducing hardware and data storage costs and offering flexible solutions that accommodate growth and scaling, the most significant benefits of a cloud ERP are accessibility to real-time information and a streamlined solution. With a cloud based ERP solution, one has access to real-time data across multiple platforms, from desktop to mobile devices.


Cloud based ERP solutions keep departments aligned, which reduces missing information, duplication, and other errors. This generates more accurate and complete information, enabling an organization to make more optimal and transparent decisions. It also allows other applications to be integrated with the ERP platform, which in turn gives easier access to complementary and supporting information for decision making. Integrating all departments and functions of a company into a single system that streamlines processes and backs them up to the cloud, for strong security and anywhere/anytime accessibility, undoubtedly enhances a company’s performance.

Streamlined processes:

Various departments in a company tend to maintain their own data in a number of software systems. Problems arise when data from one department has to be transferred to another – a tedious process when each department keeps its data in different forms, formats, or software.

Cloud based ERPs are collaborative and integrate with other systems. They not only help integrate software and data but also link processes between departments so that communication is fluid. For example, one of their greatest strengths is streamlining the process by which your business takes customer orders and turns them into invoices and revenue. This order fulfillment process can be long and complicated, or rapid and efficient. When one department finishes with an order, it is automatically routed via your ERP software and moved along to the next department.

Easy to handle:

  • Cloud based ERP works on Software as a Service (SaaS) which allows users to access ERP software using the Internet.
  • Cloud based ERP costs less, since the resources needed to install, operate, and maintain it are typically lower than for solutions hosted “on-premise”.
  • Cloud based ERP also gives companies access to their business-critical applications at any time from any location.

Centralized and collaborative:

Instead of having a company’s front- and back-office applications in separate systems, using one cloud platform allows the apps to communicate with each other and share a central database. This reduces data loss.

  • On the other hand, “on-premise” systems involve high setup costs and time from a local IT team. The cost of setup must also include maintenance of hardware, server rooms, and more.
  • When it’s time to upgrade an on-premise ERP system, it must be deployed again to the servers and users, and the various customizations and integrations installed on the earlier software must be reimplemented – essentially duplicating these efforts and expenses every so often.
  • A cloud based ERP provider hosts and maintains all of the IT infrastructure, ensures the system is always running, and keeps the data secure.
  • Cloud based ERP adjusts automatically, dynamically provisioning additional resources as needed. Cloud ERP deployments usually take 3-6 months, compared to the 12 months it typically takes to implement an on-premise solution.
  • By moving to a cloud based ERP solution, businesses can improve productivity, decrease costs, and increase efficiency at an effective scale.

Please feel free to reach us at sales@bistasolutions.com for any queries on Cloud ERP. You can also contact us by filling in the contact form.

 

Benefits Of Big Data On Cloud Computing

  • by bista-admin
  • Dec 19, 2017

Big Data “Evolution”

“You can have data without information, but you cannot have information without data.” – Daniel Keys Moran

The above quote captures the importance of data. Ignoring the importance of big data can be a very costly mistake for any kind of business in today’s world. And if data is that important, then using effective analytics and big data tools to unlock its hidden power becomes imperative. Here we will discuss the benefits of using cloud computing for big data. If you have followed our earlier blogs, we have discussed the value of big data at length; here we will explore it even further.

Today, every organization, government, IT firm, and political party considers data a new and extremely useful currency. They willingly invest resources to unlock insights from the data collected in their respective fields, which can be profitable if it is adequately mined, stored, and analyzed.

The early stages of using big data mostly revolved around storing the data and applying some basic analytics. As the practice has evolved, we have adopted more advanced methods of modeling, transforming, and extracting data on a much larger scale. The field of big data now has the capacity for a globalized infrastructure.

Internet and social media giants such as Google and Facebook were the pioneers of big data when they began uncovering, collecting, and analyzing the information generated by their users. Back then, companies and researchers worked with externally sourced data, drawn mostly from the internet or public data sources. The term “big data” wasn’t coined until approximately 2010, when the power, need, and importance of this information became clear. Given the scope of the information, the term “big data” came into the picture, and with it the arrival of newly developed technologies and processes to help companies turn data into insight and profit.

Big Data “Establishment”

The term Big Data is now used almost everywhere across the planet – online and offline. Before, information stored on your servers or computers was merely sorted and filed. Today, all data becomes big data, no matter where you store it or in which format.

How big is Big Data?

Essentially, all the digital data available, combined, is “Big Data”. Many researchers agree that Big Data as such cannot be handled using normal spreadsheets or the regular tools of database management. Processing big data requires specialized analytical tools or infrastructures such as Hadoop or NoSQL databases. These tools can handle larger volumes of information in various formats, so that all the data can be handled in a single operation. Big data processing is commonly characterized by four big Vs: Velocity, Variety, Veracity, and Volume.

Let’s dig into big data and the role of analytics a bit further. The figure below helps to visualize and understand the four big V’s.

[Figure: the four V’s of big data – Volume, Velocity, Variety, and Veracity]

 

Why should we have big data on the cloud?

There are several reasons for having big data on the cloud. Some of them are discussed below:

Instant infrastructure

One of the key benefits of a cloud-based approach to big data analytics is the ability to stand up big data infrastructure quickly in a scalable environment. A big data cloud service provides the infrastructure that companies would otherwise have to build from scratch.

Big data on the cloud puts all analytics needs under a single roof. It is important to note that the success of cloud-based big data analytics depends on several key factors, the most significant being the quality and reliability of the solution provider. The vendor must combine robust, extensive expertise in both the big data and cloud computing sectors.

Cutting costs with big data in the cloud

This offers major financial advantages to participating companies – but how? Performing big data analytics in-house requires companies to acquire and maintain their own data centres, and maintaining data centres is expensive; that budget can instead go toward a company’s expansion plans and policies.

Shifting big data analytics to the cloud allows firms to cut the costs of purchasing equipment, cooling machines, and ensuring security, while still allowing them to keep the most sensitive data on-premise under their full control.

Fast Time to Value

A modern data-management platform brings together master data management and big data analytics capabilities in the cloud, so that businesses can create data-driven applications using reliable data with relevant insights. The principal advantage of this unified cloud platform is faster time-to-value, keeping up with the pace of business. Whenever there is a need for a new data-driven decision-management application, you can create one in the cloud quickly. There is no need to set up infrastructure (hardware, operating systems, databases, application servers, analytics), create new integrations, or define data models or data uploads. In the cloud, everything is already set up and available.

Conclusion:

Cloud-based data management as a service helps organizations blend master data and big data across all domains. This union of data, operations, and analytics in a closed loop provides an unprecedented level of agility, collaboration, and responsiveness – all made possible by cloud technologies.

There are many benefits to keeping big data on the cloud. For more insights on big data analytics and cloud computing, you can get in touch with us through sales@bistasolutions.com.

Testing in Odoo ERP

  • by bista-admin
  • Nov 30, 2017

What is Testing in Odoo ERP?

Testing is a critical step in any successful project. Testing in Odoo ERP to discover bugs is relatively easy when you follow a basic outline. Below we walk you through the steps of the testing process and how to apply it in Odoo ERP.

Why Odoo Testing is Needed:

Testing should be done for two reasons:

  1. Verification – Process of making sure that the product behaves the way we want it to.
  2. Validation – Process of making sure that the product is built as per the customer’s requirements.

Based on these reasons, two types of testing techniques came into the picture.

  1. White box testing – It checks the internal working mechanisms of a program and the programming skill of the developer; the output we get matters least here. It is also known as glass box testing, transparent testing, and structural testing.
  2. Black box testing – It is the process of checking the outputs of the program. It puts the least stress on how the program is designed; the internal mechanism of the program is not taken into consideration.

Based on how testing is achieved, there are two more types of testing techniques:

  1. Static testing – Most cost-effective testing technique. It can be done through reviewing the documents and source code, inspection, and walk-through.
  2. Dynamic testing – A more advanced testing technique. The developer/tester writes programs, feeds them to automated test tool(s), and examines the received output.

Types of Testing:

There are numerous ways to test software. Some of the testing techniques relevant to Odoo are listed below:

  1. Unit Testing – One of the white box testing techniques. The programmer tests an individual unit to check whether the unit he/she has implemented produces the expected output for a given input.
  2. Functional Testing – Black box testing to ensure that the functionality specified in the system requirements is working.
  3. Integration Testing – An individual module is integrated with other modules and tested for all functionalities. This process continues until all modules are integrated and we have one product to serve to the customer.
  4. System Testing – Testing to ensure that the software still works when put in different environments (e.g., operating systems). System testing is done with the full system implementation and environment. It falls under the class of black box testing.
  5. Stress Testing – A form of deliberately intense or thorough testing used to determine the stability of a given system or entity. It involves testing beyond normal operational capacity, often to a breaking point, in order to observe the results.
  6. Usability Testing – Done entirely from the user’s perspective. It asks questions like: is the interface user-friendly? Can the user learn from the system? Can they get help from the system itself if they are stuck somewhere? It is a black box type of testing.
  7. User Acceptance Testing – This black box type of testing is done by the customer to ensure that the delivered product meets the requirements.
  8. Regression Testing – This black box testing is done after making changes to the existing system, to ensure that the modification works correctly and does not damage the other units of the product.
  9. Beta Testing – Beta testing falls under black box testing. It is done by people outside the organization, especially those who were not involved in the development process. The aim is to test the product against unexpected errors.
  10. Smoke Testing – Smoke testing, also known as “Build Verification Testing”, is a type of software testing that comprises a non-exhaustive set of tests aimed at ensuring that the most important functions work. The results of this testing are used to decide whether a build is stable enough to proceed with further testing.

The term ‘smoke testing’, it is said, came to software testing from a similar type of hardware testing, in which the device passed the test if it did not catch fire (or smoke) the first time it was turned on.

The following are the benefits of smoke testing:

  • It exposes integration issues.
  • It uncovers problems early.
  • It provides some level of confidence that changes to the software have not adversely affected the major areas (the areas covered by smoke testing, of course).

How to Pick Testing Technique

“Testing is not a phase; it is a process that is part of the Software Development Life Cycle.”

Testing starts from the moment a programmer starts writing a program. Different testing techniques are quite likely to be needed even after the product has been delivered to the customer and the maintenance phase of the SDLC is underway.

Choosing one or more techniques depends entirely on the intention of the testing. Based on those requirements, one can choose the appropriate testing techniques.

How to Use Testing Cleverly in the Context of Odoo

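In practice, Odoo ships with a test framework built on Python’s unittest: test files live in a module’s tests/ folder and are executed when the server is started with the --test-enable option. Below is a minimal sketch of an Odoo unit test; the class name and the values checked are illustrative, not taken from a real module.

# A minimal sketch of an Odoo unit test (white box / unit testing as
# described above). TransactionCase runs each test inside a database
# transaction that is rolled back afterwards, so tests leave no trace.
from odoo.tests.common import TransactionCase


class TestPartnerCreation(TransactionCase):

    def test_partner_defaults(self):
        # Exercise one small unit of behaviour and assert on its output
        partner = self.env['res.partner'].create({'name': 'Test Partner'})
        self.assertEqual(partner.name, 'Test Partner')
        self.assertTrue(partner.active,
                        "New partners should be active by default")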

If you are looking for Odoo or any related module, our team can help you select the modules and the number of licenses. For more information, you can email us at sales@bistasolutions.com.

Difference Between ETL And ELT And Their Importance

  • by bista-admin
  • Nov 28, 2017

ETL is the most commonly used method for transferring data from a source system to a destination system or data warehouse, while ELT is increasingly in demand in today’s analytical environment. There are therefore cases where you might have to use ELT processes as well. So what is the difference between these two? How do we use them, how is the data loaded, and how do we work with the data in between these processes? We will cover the differences between ETL and ELT and their importance one by one.

ETL (Extract, Transform and Load):

Extract, Transform and Load is the process of extracting data from sources (external systems, on-premises databases, etc.) into a staging area, transforming or reformatting it – applying business manipulation so that it fits operational needs or data analysis – and then loading it into the target databases or data warehouse.

[Figure: the ETL process]

Typically, during the extraction step, data from the source system is loaded into the staging area, i.e. staging tables, for temporary processing. The extract step copies data from the source system to the staging tables quickly, in order to minimize the time spent querying the source system. The transform step manipulates data and performs business calculations on the staging tables copied from the source system; working only on the relevant data, rather than the whole source system, reduces processing time before loading into the target data warehouse. Once the transformation step is complete, the needed data is loaded into the target data warehouse for business intelligence purposes.

ETL uses a pipeline approach: data flows from source to target, and a transformation engine or scripts take care of data manipulation and calculation between these stages.

Numerous tools are on the market for the ETL process, such as Talend Data Integration, Informatica, and SSIS. ETL is the most common methodology in business analytics and data warehousing projects, and these operations can be performed either by custom programming or by the above-mentioned ETL tools.

The overall time consumption of the ETL process is ideally lower than other approaches, since ETL extracts only the data needed for the present requirement and manipulates only that data, rather than operating over the whole data set. Hence the ETL process is used in many cases. Also, if the target system is not powerful, ETL is more economical.
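To make the extract, transform, and load stages concrete, here is a minimal, self-contained sketch in Python, using sqlite3 as a stand-in for both the source system and the data warehouse; the table names, columns, and exchange rate are hypothetical.

# A sketch of the ETL pattern described above, runnable anywhere Python is
# installed. Extract copies source rows into staging, transform applies a
# business rule on staging only, load moves the result into the warehouse.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Source system (stand-in)
cur.execute("CREATE TABLE src_orders (id INTEGER, amount REAL, currency TEXT)")
cur.executemany("INSERT INTO src_orders VALUES (?, ?, ?)",
                [(1, 100.0, "USD"), (2, 80.0, "EUR"), (3, 55.5, "USD")])

# Extract: copy quickly into staging to minimize load on the source
cur.execute("CREATE TABLE stg_orders AS SELECT * FROM src_orders")

# Transform: business calculation on staging only (convert EUR to USD)
EUR_TO_USD = 1.1  # illustrative rate
cur.execute("UPDATE stg_orders SET amount = amount * ?, currency = 'USD' "
            "WHERE currency = 'EUR'", (EUR_TO_USD,))

# Load: move the transformed rows into the target warehouse table
cur.execute("CREATE TABLE dw_orders (id INTEGER, amount_usd REAL)")
cur.execute("INSERT INTO dw_orders SELECT id, amount FROM stg_orders")
print(cur.execute("SELECT * FROM dw_orders").fetchall())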

ELT (Extract, Load and Transform):

As the name suggests, ELT – Extract, Load and Transform – is a different way of looking at data migration or movement. ELT involves extracting the whole data set from the source system and loading it into the target system, instead of transforming it between the extraction and loading steps. Once the data is copied or loaded into the target system, the transformation takes place there.

[Figure: the ELT process]

In the ELT process there is no transformation engine between the extract and load steps. The transformation operation is taken care of by the target system, so the data is directly usable for development purposes and useful business insights. This approach therefore provides better performance in certain scenarios.
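For contrast, here is a matching sketch of ELT: the raw data is loaded first, and the transformation runs inside the target system itself (the same sqlite3 database stands in for a powerful target such as Hadoop); the names and conversion rate are again hypothetical.

# A sketch of the ELT variant: load everything raw, then transform in the
# target. A view derives the analytical shape on demand, so all raw data
# stays available for future, not-yet-known requirements.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE raw_orders (id INTEGER, amount REAL, currency TEXT)")
cur.executemany("INSERT INTO raw_orders VALUES (?, ?, ?)",
                [(1, 100.0, "USD"), (2, 80.0, "EUR")])

# Transform happens at the target end, after loading
cur.execute("""
    CREATE VIEW orders_usd AS
    SELECT id,
           CASE WHEN currency = 'EUR' THEN amount * 1.1 ELSE amount END
               AS amount_usd
    FROM raw_orders
""")
print(cur.execute("SELECT * FROM orders_usd").fetchall())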

The drawback or weakness of the ETL process is the limitation it places on the data: the pipeline cannot hold very large data sets for operations like sorting before moving them to the target system. And nowadays, how can we know how much data we will need now or in the near future? Hence the restriction of data in the ETL process is a real limitation.

ELT processes, besides swapping the position of two letters, change the overall concept of data management. Instead of restricting or limiting the data, ELT makes all the data available by copying it onto a powerful target system like Hadoop. Hadoop is capable of handling large volumes of data without being file-type discriminatory (e.g. flat files, spreadsheets, tables, JSON, images, etc.).

Hence all the data from the source is extracted and loaded onto the target, collecting everything that may be needed for data manipulation, business insights, and analytics, both for the present moment and the near future.

ELT can overcome the limits of traditional staging-area-based approaches by performing calculations and/or manipulation at the target end, retrieving only the required amount of data and thus providing better performance at the business level with high-end clusters like Hadoop and analytical queries. Hadoop offers scalable data storage and processing platforms, so we can take only the data required at the moment and analyze it with a BI tool like IBM Cognos.

If you have any query about ETL, please drop an email to sales@bistasolutions.com. You can also write to us at feedback@bistasolutions.com and tell us how this blog has helped you.

Importance Of Testing In ETL Processes

  • by bista-admin
  • Nov 15, 2017

ETL Testing Process

ETL processes include data transfers in multiple stages: from the legacy source to the staging server, from staging to the production database/data warehouse, and finally from the data warehouse to data marts. Each step is highly vulnerable and prone to errors, loss of data, or incorrect transfer of data. This is where the concept of testing comes into the picture in ETL cycles. The scope of work for an ETL developer does not end when the ETL script runs finish; this is actually the beginning for any developer. A good ETL developer must be able to validate the records and ensure accuracy.

The ETL testing process can be broadly classified into two types:

  1. OLTP (On-line Transaction Processing)
  2. OLAP (On-line Analytical Processing)

OLTP is the testing of one particular database instance, while OLAP involves testing the whole data warehouse. This is the most important distinction: OLTP does not imply OLAP. OLTP just ensures correct data transfer from a source to a target in one particular database, whereas OLAP takes care of the accuracy and performance parameters throughout the data warehouse.

Challenges faced in ETL Testing:

As mentioned earlier, the ETL process is full of challenges and prone to errors. At every step, ETL developers are likely to face at least five barriers. Here is a list of a few common challenges in ETL testing:

  • Frequent changes in the business requirements lead to changes in the logic of ETL scripts
  • Limited availability of source data
  • Undocumented “source to target” mapping requirements, which lead to ambiguous logic
  • Delays in the output of complex SQL queries, which slow the working rate
  • Verifying and validating data that comes from different sources with varied formats and structures
  • Unstable testing environments
  • The huge volume of data to test

In this article, we at Bista Solutions convey a few important tests everyone needs to perform to validate ETL processes:

1. Check the Source and Structure of the Data before deciding on the Migration Plan:

This is the prime step in ETL testing and becomes the foundation for the entire ETL process. With growing complexity in data, understanding the structure of the data at the source is evidently important. After understanding the structure of the source data, one may need to cleanse the data before it is actually loaded into the staging area.

2. Ensure that the mapping document provided is correct:

The second step is to check if the mapping document provided abides by the business requirements of the client and hence ensures correct mapping of fields from source to target tables.

3. Checking and verifying your ETL scripts:

Your ETL scripts must be smart enough to handle null values in the data, and they must import or update the correct data with proper data types. It is also best if the ETL scripts are automated, to avoid the human interaction that introduces errors or bugs. A sketch of this kind of defensive handling follows below.
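Here is a minimal sketch of such null-safe cleansing in Python; the field names, defaults, and rejection rule are hypothetical and only illustrate the idea.

# A sketch of defensive null handling in an ETL script: rows missing a
# required key are rejected, optional fields get typed defaults.
def clean_row(raw):
    """Return a typed row dict, or None if the row must be rejected."""
    if raw.get("customer_id") in (None, ""):      # required key: reject nulls
        return None
    return {
        "customer_id": int(raw["customer_id"]),
        "amount": float(raw["amount"]) if raw.get("amount") else 0.0,  # default
        "country": (raw.get("country") or "UNKNOWN").strip().upper(),
    }

rows = [{"customer_id": "7", "amount": "19.9", "country": "us"},
        {"customer_id": "", "amount": "5"}]       # second row is rejected
cleaned = [r for r in (clean_row(x) for x in rows) if r is not None]
print(cleaned)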

4. Check for Data Completeness:

Once the data is loaded into the target database, the first and most important job is to verify the completeness of the data. You also need to verify that all invalid data has been either corrected or removed in accordance with the requirements. A simple completeness check is sketched below.
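One way to automate this check is to compare row counts and a simple column checksum between source and target, as in the sketch below; the connections, table names, and column name are placeholders, not a real schema.

# A sketch of a data-completeness check between a source and a target
# database (here both sqlite3). Counts and a SUM() checksum must agree.
import sqlite3

def completeness_check(src_conn, tgt_conn, src_table, tgt_table, amount_col):
    query = "SELECT COUNT(*), COALESCE(SUM({col}), 0) FROM {tbl}"
    src = src_conn.execute(query.format(col=amount_col, tbl=src_table)).fetchone()
    tgt = tgt_conn.execute(query.format(col=amount_col, tbl=tgt_table)).fetchone()
    assert src[0] == tgt[0], "Row count mismatch: %s vs %s" % (src[0], tgt[0])
    assert abs(src[1] - tgt[1]) < 1e-6, "Checksum mismatch: %s vs %s" % (src[1], tgt[1])
    return True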

5. Performance and Scalability:

Completing the migration once is not the end of the story. ETL developers must anticipate the growth rate of the data and keep the system ready to scale up and perform well for huge amounts of data as well.

After all these tests have been performed, the project leads need to get a user acceptance test done by the end users, to ensure the system fits their requirements without violating the integrity of the system. They might eventually need to perform regression testing as well, if a new version of the app is rolled out.

Conclusion:

In ETL processes, one must understand that data accuracy is the key to arriving at important decisions in any business. That said, identifying bugs, performing root cause analysis on each of them, and reporting them at an early stage of software development helps to reduce cost and time. Before getting into the ETL testing process, you need to check the different systems, their processes, models, and business requirements for any inconsistencies or ambiguities. ETL developers also need to do data profiling/data mining in order to understand the trends and patterns of the data better and identify any source data bugs.

If you have any queries about ETL testing, contact us or drop an email at sales@bistasolutions.com.

ETL Data Transfer

  • by bista-admin
  • Nov 10, 2017

In today’s world, a business needs to manage not only its physical assets but also the data its organization produces. This is where ETL tools play a vital role, assisting the organization with ETL data transfer and helping it remain competitive in the market.

ETL stands for Extract, Transform and Load. Just as the name implies, these tools extract data from a given source – a properly structured database, a flat file, data from web apps, or something as trivial as sensor data in the form of 0s and 1s. The second step transforms the data while in transit, from making the data readable and performing complex data type conversions to applying arithmetic/logical operations. Finally, the data is loaded into the given destination storage. A small sketch of these three steps follows below.
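As an illustration of those three steps, the sketch below extracts from a flat file, converts data types in transit, and loads the result into a database; the file name, table, and filter rule are hypothetical.

# A sketch of flat-file ETL: extract from CSV, transform types and filter
# in transit, load into a database table. Assumes a sensor_data.csv file
# with sensor_id and value columns (hypothetical).
import csv
import sqlite3

conn = sqlite3.connect("warehouse.db")
conn.execute("CREATE TABLE IF NOT EXISTS readings (sensor_id INTEGER, value REAL)")

rows = []
with open("sensor_data.csv", newline="") as f:        # Extract
    for rec in csv.DictReader(f):
        value = float(rec["value"])                   # Transform: type conversion
        if value >= 0:                                # Transform: logical filter
            rows.append((int(rec["sensor_id"]), value))

conn.executemany("INSERT INTO readings VALUES (?, ?)", rows)  # Load
conn.commit()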

Some common projects where ETL Data Transfer is a must are:

  1. Pulling up transactional data (sales + purchases) for company heads to work with and generate visualization reports. This is commonly known as Data Warehousing.
  2. Migrating data from legacy systems to new systems due to a change of application/platform.
  3. Data integration triggered by corporate mergers and acquisitions.
  4. ETL can also help integrate data from third-party suppliers/vendors or partners in Supply Chain Management cycles.

[Figure: ETL tools managing data across the organization]

This picture depicts how critical ETL tools are in managing data generated throughout your organization.

Which ETL Data Transfer Tool to Choose:

Considering the above image, every organization must spend some time on R&D to determine which ETL tool fits best into their business. Below are some of the criteria any ETL tool must meet:

  1. Data Connectivity: The chosen ETL tool must be able to connect to any data source, no matter where the data is coming from. This is critical!
  2. Performance: Dealing with huge amounts of data and transforming it requires serious processing capability, so the ETL tool you choose should be able to scale up with growing data rates.
  3. Rich Transformation Library: Transforming data manually requires writing thousands of lines of code, which is highly prone to errors. To enable smooth data transformation, your ETL tool must provide a rich library of functions and packages, ideally with easy drag-and-drop facilities.
  4. Data Quality Check: You can never just pick up data from a given source and start transforming it – your data is never clean enough as-is, so you will definitely need some data-cleansing support from your ETL tool.
  5. Committed ETL Vendor: As the above points show why ETL is a critical process, it is also important to choose a committed ETL vendor who knows the tool inside out and can provide good support throughout the project.

We at Bista Solutions evaluate the business requirements of our clients, and accordingly we offer the best suite of solutions to cover all of our clients’ pain areas and give them an A-to-Z solution.


If you have any queries about ETL, contact us or drop an email at sales@bistasolutions.com.

Big Data For Better Governance

  • by bista-admin
  • Nov 07, 2017

The amount of data collected and stored worldwide is more than we can imagine. From the smallest internet user to entire countries on centralized systems, data is gathered and deposited on a continual basis, bringing the information era into what could be considered a more mature stage of “big data”. The question then is what to do with all that information. Sales and marketing is only one small use; perhaps we can look at a larger picture where entire industries or governments are involved.

Big Data in Government

Government data is increasing in volume with the growth of mobile devices and applications, cloud computing solutions, and citizen-facing portals. Through these devices, citizens deliver incredible amounts of detailed personal information. Big data technology lies at the heart of being able to manage these databases and extract useful information from them for the benefit of communities.

Big Data In Defense

Military centers across the globe are designing roadmaps to implement big data within the armed forces. Experts know that conflict engagement is now being shaped and decided with the assistance of collected data. But this data is not only being used for soft decisions; it is also used for designing machines, so that they will be smart enough to make autonomous decisions when possible. These smart machines will collect additional data and analyze historical data, then act according to the processed information.

Big Data in Cyber Security

As more of our lives come “online”, connected via IoT, protection against malware is critical. Each day we learn of more malware attacks, an obvious threat to the integrity of data. When an attack occurs, the victim not only needs to stop the attack but also needs to analyze its impact and consequences. Remediating malware activity involves several steps, including a deep analysis of the code. Big data can help to identify trends, build a profile and other identifying information about the attacker, and understand the impact of the attack more quickly.

Big Data in Healthcare

Collecting information and data related to the health of a country’s citizens could easily help experts working in this industry by giving them an idea of how to improve the nation’s health. The collected data of all patients can be analyzed by experts to understand trends and areas of opportunity. In addition, on an individual level, doctors can feed an individual’s raw data into formulas to deliver personalized healthcare suggestions.

Big Data in Education

As with other industries, education has collected a lot of data from various schools and educational organizations, and experts are analyzing it for insight on how to improve education. Some of the areas being explored are subject matter, systems improvement, and trends and habits in attendance. With the amount of information available beyond academic performance, one can expect many changes in the coming years in all areas of schools and teaching, across all levels of education.

Deep analysis has already taken place in some states and has been a huge success so far. For example, after a detailed analysis of one school’s data, a shocking result was revealed: a correlation between some of the school’s dropouts and the availability of toilets in that school. When this kind of useful information is revealed, a country can progress in every aspect of quality.

Big Data in Finance

Finance is one of the most detailed areas we have available for data. Globally, we have been collecting this information for decades, on a daily basis. Big data is now available for loans, mortgages, trading, investments, and more. Analysing the real-time behaviour of clients and providing them with information related to their interests helps them make solid financial decisions in the moment. It can even be critical for decisions based on timing, given the fluctuations in the stock market and interest rates.

In summary, big data is very helpful in governance, as processing the data and rendering useful information out of it can benefit the growth of any country and any industry.

We hope you like the blog and share it with your network. Please reach out to sales@bistasolutions.com for any query pertaining to Big Data and Analytics solutions.

How To Install Odoo 11 On Ubuntu

  • by bista-admin
  • Nov 03, 2017

Odoo 11 has been released, and this blog is for those who wish to install it on their systems. Since Odoo 11 is supported only on Python 3.5 and higher, this blog also shows how you can keep Odoo 10 and run Odoo 11 simultaneously.

STEP 1:

Check whether Python 3.5 or a higher version is installed on your system.

Open a terminal and type python3.5. If it shows the Python prompt as in the image below, you do not need to install Python 3.5 explicitly.

NOTE: Python3.5 is available in Ubuntu 16.04 by default.

[Screenshot: the python3.5 interactive prompt]

If Python 3.5 is not installed, go to the terminal and execute the following commands.

1.1 cd /usr/src

1.2 wget https://www.python.org/ftp/python/3.5.2/Python-3.5.2.tgz

1.3 sudo tar xzf Python-3.5.2.tgz

1.4 cd Python-3.5.2

1.5 sudo ./configure

1.6 sudo make altinstall

To check that Python is installed correctly on your system, run python3.5 in the terminal again.

 

STEP 2:

Install the python dependencies.

sudo python3.5 -m pip install pypdf2 Babel passlib Werkzeug decorator python-dateutil pyyaml psycopg2 psutil html2text docutils lxml pillow num2words reportlab requests gdata XlsxWriter vobject python-openid pyparsing pydot mock mako Jinja2 ebaysdk feedparser xlwt

STEP 3:

Install and configure the latest version of postgres.

If an old version of Postgres is already installed, you can replace the old version with the new one by following the steps below.

3.1 Upgrade the Postgres

3.1.1 sudo apt-get upgrade

3.1.2 Get the latest version of postgres from https://www.postgresql.org/download/linux/ubuntu/

3.1.3 To find the installed versions that you currently have on your machine, you can run the following:

                dpkg --get-selections | grep postgres

3.1.4 You can also list the clusters that are on your machine by running:

                pg_lsclusters

3.1.5 Stop the postgres service before making any changes.

                sudo service postgresql stop

3.1.6 Rename the new postgres version’s default cluster.

                sudo pg_renamecluster 9.6 main main_pristine

3.1.7 Make sure that everything is working fine.

                sudo service postgresql start

3.1.8 Drop the old cluster

                sudo pg_dropcluster 9.3 main

3.1.9 Create the odoo user

                sudo su - postgres -c "createuser -s odoo"

3.2 Fresh installation of Postgres

3.2.1 sudo apt-get install python-software-properties

3.2.2 sudo vim /etc/apt/sources.list.d/pgdg.list

3.2.3 Add the following line to the file:

                deb http://apt.postgresql.org/pub/repos/apt/ xenial-pgdg main

3.2.4 wget --quiet -O - https://www.postgresql.org/media/keys/ACCC4CF8.asc | sudo apt-key add -

3.2.5 sudo apt-get update

3.2.6 sudo apt-get install postgresql-9.6

3.2.7 sudo su postgres

3.2.8 createuser -s ubuntu_user_name

3.2.9 exit

STEP 4:

Install the js libraries and dependencies.

4.1 sudo apt-get install node-clean-css -y

4.2 sudo apt-get install node-less

4.3 sudo apt-get install -y npm

4.4 sudo ln -s /usr/bin/nodejs /usr/bin/node

4.5 sudo apt-get install python-gevent -y

4.6 sudo npm install -g less

4.7 sudo npm install -g less-plugin-clean-css

STEP 5:

Finally, start the Odoo 11 server:

python3.5 ./odoo-bin --addons-path=addons/

If you are looking for an Odoo 11 implementation, Community or Enterprise edition, our team can help you select the modules and the number of licenses. For more information, you can email us at sales@bistasolutions.com.

Can Cloud ERP Make Your Business Agile?

A cloud solution takes the difficulties of capacity and location off the table by removing the need for on-site hardware infrastructure. It also provides 100 percent real-time visibility. Cloud ERP is also extremely efficient when it comes to changing to new processes, which makes it highly agile.

In Forrester’s Vendor Landscape for SaaS ERP Applications, principal analyst Paul Hamerman says: “Applications built for SaaS [software as a service] tend to be quicker to deploy and easier to set up, accelerating time-to-value and permitting agility for growth companies.”

“In addition, several SaaS ERP products – for example, FinancialForce, Oracle Cloud ERP and Oracle NetSuite – give you native extensibility environments – such as platform as a service (PaaS) – to give businesses and partners additional flexibility to customize applications.”

Cloud ERP

Cloud Enterprise Resource Planning (ERP) can be used to enable businesses to launch new ventures fast and support rapid expansion, as in the case of mergers and acquisitions. In this case, SaaS ERP is best used to support new subsidiaries cost-effectively – an idea known as two-tier ERP. Two-tier ERP is the practice of running two ERP systems at once: one larger system at the corporate level, and one smaller system at the plant, division, or subsidiary level.

Forrester’s research has shown that traditional ERP customers have been slow to adopt cloud ERP.

Hamerman believes many average businesses have been late to adopt because they wish to protect a profitable stream of income instead of taking on an expenditure on software.

“Co-existence between SaaS and on-site (or hosted) versions could allow a customer to switch deployment modes in either direction, easing an otherwise disruptive migration,” says Hamerman.

For example, Oracle now offers a SaaS-only ERP product thanks to its acquisition of NetSuite, while SAP has an internally developed SaaS ERP offering, Business ByDesign.

“Your incumbent on-premise ERP vendor might offer an appealing migration path to SaaS, though it takes experience to know the advantages and costs of this plan – and whether the new SaaS offering has architectural, flexibility and functionality advantages comparable to products natively built for SaaS,” says Hamerman.

Cloud ERP is not necessarily a single product or service available in the cloud. Gartner analysts recognize a new era of ERP described as “postmodern ERP”: a strategy that automates and links administrative and operational functions (such as finance, HR, purchasing, manufacturing, and distribution) with appropriate levels of integration that balance supplier-delivered integration against business flexibility and agility.

In August 2016, Gartner released the “You No Longer Need A Cloud ERP To Solve Your ERP Difficulties” report. The research notes that some functions within on-premise ERP mega-suites – such as human capital management and indirect procurement – are now dominated by SaaS.

Nevertheless, Gartner points out that other functions – such as operational ERP and enterprise asset management – are still mostly on-site or hosted.

“Many firms wish they could ‘lift and shift’ their whole current on-premise landscape to the cloud, though this is insufficient in a world of postmodern ERP,” says the report’s author, Christian Hestermann.

Nevertheless, Hestermann’s research found that after implementing some cloud-based or SaaS ERP installations, users and managers realized that a number of the expected “guarantees” were not delivered, or were more complex than they anticipated.

“While cloud technologies offer options for how to deploy ERP systems at different levels of the technology stack – infrastructure as a service (IaaS), PaaS or SaaS – they do not transform ERP alternatives into something completely different,” says Hestermann. “Cloud technologies alone do not automatically ‘fix’ all of the problems associated with on-premise ERP. In reality, they can create some new challenges.”

Please feel free to reach us at sales@bistasolutions.com for any queries on Cloud ERP software and its related modules. You can also write to us at feedback@bistasolutions.com and tell us how this information has helped you.