Nexeed Data Analytics
FAQs about Nexeed Data Analytics
A brief explanation of the benefits of data analytics and predictive maintenance
How to benefit from predictive maintenance
How to start
How do I choose a predictive maintenance solution?
There is no single best predictive maintenance solution: suitable approaches depend heavily on, and are very specific to, the respective machine or process. Above all, creating a predictive maintenance solution is a step-by-step process (from descriptive approaches and condition monitoring to predictive, prescriptive and automated solutions).
How long does a data analytics project take?
Resources and time required for a data analytics project depend on a number of factors. The key factors are:
- the project scope and scale,
- the quality and availability of the required data,
- already established understanding of analytics tools and computing infrastructure,
- the skills and knowledge of the analytics team and
- most importantly, support of the management team and the local experts for the data analytics project.
The data analytics team will define a project timeline based on the factors listed. Since we take an iterative approach, you’ll have initial results within 4 to 6 weeks to refine further as the project progresses. These results could range from initial insights that provide a one-time solution for your use case, to a proof of concept for the solution architecture, to a minimum viable product.
What costs are involved in a data analytics project?
For many of our customers, our data analytics consulting services for manufacturing have proved the ideal way to kick-start data analytics projects. A variety of consulting services are available depending on your needs — including on-site consultations as an option. Most of our consulting services are available for a fixed price. Learn more about data analytics consulting.
We price data analytics projects transparently. In our experience, most projects initially require approximately ten to twenty working days. The time required may be less if you only need a simple data quality check or feasibility study, but it can also be more, depending on the complexity and on whether you have a very specific infrastructure in mind.
In what cases does data analytics help leverage savings better than more traditional engineering and statistical methods (e.g. Six Sigma)?
Looking at the work habits (e.g. problem solving) and common tools and methods (Excel, Six Sigma, etc.) of most engineers and technicians today, the approach to optimization projects is usually one-dimensional. In the case of end-of-line scrap, for instance, only test data is analyzed. This shows you the effect (e.g. certain failure codes or failed test steps) but not the cause, which is what improvement measures should actually target.
By using data analytics techniques, you can consider test data (with information on which parts passed the tests and which failed) in conjunction with the respective process and quality data for the final product and for components. Add machine data, traceability data, environmental data, etc. to the mix, and search for correlations to obtain new insights.
Specific algorithms aid in identifying multidimensional cause-effect relationships—such as end-of-line scrap rate increases with a certain failure mode—consisting of: component A from supplier B, press-in force close to lower limit, and machine X shortly before next scheduled maintenance.
Having the right tools on hand to apply these algorithms, as well as the IT infrastructure and computing power to perform multivariate analyses within a reasonable amount of time, are basic enablers for this advanced analytics approach.
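As a minimal sketch of this idea — using pandas and entirely made-up part numbers, forces and suppliers — joining test results with process data and checking for correlations might look like this:

```python
import pandas as pd

# Hypothetical end-of-line test results, keyed by a part identifier.
tests = pd.DataFrame({
    "part_id": [1, 2, 3, 4, 5, 6],
    "failed":  [0, 0, 1, 0, 1, 1],
})

# Hypothetical process and traceability data for the same parts.
process = pd.DataFrame({
    "part_id":        [1, 2, 3, 4, 5, 6],
    "press_in_force": [52.0, 51.5, 48.2, 52.3, 48.0, 47.9],
    "supplier":       ["A", "A", "B", "A", "B", "B"],
})

# Link the data sources via the shared identifier.
merged = tests.merge(process, on="part_id")

# Correlate failures with the press-in force: a strongly negative value
# hints that parts pressed close to the lower force limit tend to fail.
corr = merged["failed"].corr(merged["press_in_force"])
print(round(corr, 2))

# Failure rate per component supplier.
rate = merged.groupby("supplier")["failed"].mean()
print(rate.to_dict())
```

In a real project the same join-and-correlate pattern is applied to far more dimensions (machine data, environmental data, etc.) at once, which is exactly where multivariate algorithms take over from manual inspection.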
How much data do I need to start a data analytics project?
Frankly, you can actually start without any data at all, which might even be better. Are you convinced that data analytics can help your business to be more efficient in the future? Then think about the data you want to store and how to analyze it further. Avoid wasting money on useless infrastructure, but at the same time do not forget potentially necessary identifiers. Our data analytics consulting services can help you with a variety of consulting solutions to find answers to your questions and evaluate first use cases.
The same applies to small data sets (data in the MB range). In this case, you can typically analyze the data very quickly and use it to acquire valuable information and knowledge or even sustainable benefits for your business. When the volume of data reaches several gigabytes, terabytes or more, the need for a proper data processing and storage infrastructure will be inevitable. In these cases, cutting the data you need to store by 10% reduces your costs by 10%, not including the advantages of analyzing the data more efficiently over the long term.
Methods and Practices
What is the difference between data analytics and statistics?
Data analytics does not exist without statistics. Identifying significant observations in an innumerable amount of random noise, defining outliers in standard distributions, or calculating probabilities for possible future events are just some examples of applications of statistics in data analytics.
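To make one of these statistical building blocks concrete — with invented measurement values, not real data — a classic z-score outlier check looks like this:

```python
import numpy as np

rng = np.random.default_rng(42)

# 1000 simulated "normal" measurements plus three injected outliers.
values = np.concatenate([rng.normal(10.0, 0.5, 1000), [14.0, 5.5, 13.2]])

# Classic statistics: flag points more than 3 standard deviations
# away from the mean of the distribution.
z = (values - values.mean()) / values.std()
outliers = values[np.abs(z) > 3]
print(len(outliers))
```

The same z-score idea scales from this toy example to streaming sensor data, where it becomes one ingredient of a larger data analytics pipeline.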
However, data analytics involves more than just statistics. We understand it as the entire process of creating knowledge from data—ranging from data cleaning, structuring and visualization to efficiently predicting the future in real time.
It’s all about big data, right?
Wrong! Often it is more about quality than quantity. Coherent, well-structured and consistent datasets outperform sheer volume. Your time is well invested if you think about the structure of and possible ways to analyze the data before storing terabytes of data without knowing what to do with it. As we know from experience, you will probably never look at a large portion of your data ever again. Moreover, mass data storage systems work differently compared to traditional systems. Save time and money by thinking about the data format and structure before investing excessively in infrastructure.
What tools do we use for data analytics projects?
We identify the most efficient and practical solution for your use case. Of course, we have standard solutions and tools, but standard isn’t always the best fit. The tools and algorithms depend on many factors. Are you starting from scratch? Do you already have a data storage, processing and analysis infrastructure? How much data do you produce and at what frequency? Do you have specific requirements with regard to processing times, memory or transparency? We can identify and build the right solution for your use case, instead of fitting your problem to our solution.
Generally speaking, the less mature your computing infrastructure is, the more design freedom we have. In this case, we usually start with open source software libraries such as Python or R, which offer a lot of advanced and frequently used algorithm libraries. The scripting languages Python and R make it possible to load data from a wide variety of sources, to transform and clean data and to extract knowledge from the data—ranging from visualization, to correlation analyses, to the most sophisticated systems of supervised and unsupervised machine learning algorithms.
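A minimal sketch of such a load–clean–extract pipeline in Python with pandas (the inline CSV data is invented for illustration) could look like this:

```python
import io

import pandas as pd

# Stand-in for a real data source (file, database, machine interface).
raw = io.StringIO(
    "machine,cycle_time_s,result\n"
    "X,12.1,pass\n"
    "X,,fail\n"        # missing cycle time, to be cleaned out below
    "Y,11.8,pass\n"
    "Y,12.9,fail\n"
)

# Load: read the raw data into a structured table.
df = pd.read_csv(raw)

# Clean: drop rows with missing measurements.
df = df.dropna(subset=["cycle_time_s"])

# Extract knowledge: compare average cycle times of passed vs. failed parts.
summary = df.groupby("result")["cycle_time_s"].mean()
print(summary.to_dict())
```

From this starting point the same script can grow toward visualization, correlation analyses or machine learning models, as described above.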
If you actually have big data that you want to analyze, you probably have a distributed storage and analysis infrastructure such as Hadoop. In this case, we will add the intelligence to your systems and use tools that work seamlessly with your infrastructure.
One possible outcome of data analytics are visualizations. This involves using custom solutions to provide self-explanatory graphics or standard tools such as Tableau. Ultimately, understanding the result is more important than understanding the data science magic behind it.
Which algorithms do we use?
No single algorithm can solve all of your use cases. The correct algorithm depends on the input data format, structure and content, as well as on the computing infrastructure, requirements and the use case that needs to be solved. Sometimes it is not clear from the start which algorithm will work best for the problem at hand. In this case, we might start by trying the most commonly used algorithms (e.g. (boosted) decision trees, random forests, neural networks, Bayesian models or clustering) and evaluating the performance in detail.
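As an illustrative sketch of this "try several candidates and evaluate" step — using scikit-learn and synthetic data as stand-ins, not a prescribed toolchain — a comparison via cross-validation might look like this:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in for real production data.
X, y = make_classification(n_samples=500, n_features=10, random_state=0)

# A few commonly used candidate algorithms.
candidates = {
    "decision_tree": DecisionTreeClassifier(random_state=0),
    "random_forest": RandomForestClassifier(n_estimators=50, random_state=0),
    "logistic_regression": LogisticRegression(max_iter=1000),
}

# Evaluate each candidate with 5-fold cross-validation.
results = {}
for name, model in candidates.items():
    scores = cross_val_score(model, X, y, cv=5)
    results[name] = scores.mean()
    print(name, round(results[name], 3))
```

In practice the evaluation goes beyond mean accuracy (e.g. precision/recall, runtime, memory), but the pattern of benchmarking several models before committing to one stays the same.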
Sometimes transparency is more important than building the most precise model possible. If you are still not entirely confident about data-driven decision-making, you might prefer an easy-to-explain model like a decision tree or a linear regression model instead of more sophisticated models such as gradient boosted decision trees or deep neural networks.
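To illustrate what such transparency looks like in practice — here with scikit-learn and the classic iris data set standing in for production data — a shallow decision tree can be printed as plain rules:

```python
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier, export_text

# A shallow tree on a small public data set, as a stand-in example.
X, y = load_iris(return_X_y=True)
tree = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, y)

# The fitted model can be rendered as readable if/else rules that a
# domain expert can inspect and challenge, unlike a deep neural network.
rules = export_text(
    tree,
    feature_names=["sepal_length", "sepal_width", "petal_length", "petal_width"],
)
print(rules)
```

This kind of human-readable output is often what builds the initial trust needed before moving on to more powerful but less explainable models.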
What is data analytics?
Data analytics is about understanding your data and revealing information that would otherwise never have been discovered. Using intelligent algorithms, for example, trends and outliers can be identified that prompt actions to improve a company’s quality and delivery performance or to lower costs.
Data analytics goes way beyond Excel analyses. We incorporate the latest data science software and algorithms to identify data relationships to help boost your business.
Data mining? Data science? Data engineering? Data …? What is the difference between these "data" terms?
Many different terms are used to describe a similar process, and they all have a slightly different emphasis, but they are also frequently used as synonyms. The lines between data engineering, data mining, and data science are not set in stone, which is why we prefer the term data analytics. Our focus is not on what term to use, but on how to solve real-world, data-driven problems using state-of-the-art tools and technology.
What can data analytics do for my business?
In manufacturing and logistics, the primary key performance indicators (KPIs) focus on quality, cost and delivery.
In this day and age, data is produced and collected from nearly every step along the value chain. Why not make decisions that take into account all the information available to you? Often the various sources of data are not linked together and analyzed as a whole. Data analytics changes how data is managed and analyzed, with the aim of using your data to its full potential.
Data analytics can help you implement full transparency across your value chain, automatically find the root causes of quality issues and define the best data-driven strategy to solve them. It doesn’t stop there: data analytics can identify trends, anomalies or bottlenecks, optimize test times, or predict machine failures to optimally plan maintenance intervals.
In short, wherever data is available, data analytics can help interpret information and provide guidance for a more efficient and sustainable future.
Golden rules for data analytics
Following these 5 rules will help get you in the right mindset:
1) Think big, start small: focus on useful, realistic projects.
2) Know your data: be thorough in how you collect, prepare and clean the data.
3) Be aware of biases in your data (statistical, sample data selection and filtering, etc.)
4) Get creative with data analytics and machine learning methods. Your favorite model will not solve all issues.
5) Think and communicate! The most successful project team includes domain and data experts who work together to evaluate the results.
How are Data Analytics Service projects implemented?
Our Professional Services follow an established data analytics process model, which we deploy in Manufacturing Analytics projects and expand with key "Technical Process Understanding" components. What is the decisive factor? In our experience, a purely data-based analysis is rarely sufficient. This is why, in customer projects, we rely on a technical understanding of the customer's problem and primarily look for the technical process behind the data.
Our Data Analytics engineers support you in planning and implementing your analytics projects. You benefit from experts at your side who have many years of operational experience in production environments.