Welcome to the Neural Designer blog!
Here you can find articles written by professional data scientists explaining machine learning algorithms and applications.
|4 Oct 2021||Marina Castaño|
The mathematics that support Artificial Intelligence (AI) can be difficult to understand.
In this post, we will explain them from the perspective of functional analysis and variational calculus. Read more.
|10 Aug 2021||Roberto Lopez|
Training accuracy can be defined as the minimum error that an optimization algorithm can reach during training.
In this post, we compare the training accuracy of three machine learning platforms: TensorFlow, PyTorch and Neural Designer for an approximation benchmark. Read more.
|3 Jan 2022||Álvaro Martín|
Cancer diagnosis, prognosis and treatment models using machine learning algorithms
In this post, we benefit from new technologies to detect, forecast and treat different types of cancer. Read more.
|9 Aug 2021||Laura Álvarez|
Artificial intelligence (AI) aims to build smart machines capable of performing tasks that usually require human intelligence.
In this post, we explain how artificial intelligence can positively impact your company. Read more.
|24 Mar 2021||Laura Álvarez|
There are many fields where data analysis is used. When applied in the educational sector, we talk about learning analytics.
In this post, we explain what learning analytics is and present 5 applications: student monitoring, factors influencing learning, student outlier detection, dropout and score prediction, and course recommendation. Read more.
|13 Jan 2021||Carlos Barranquero|
In data science and machine learning, capacity is the maximum amount of data that a computer program can analyze.
In this post, we compare the load capacity of three machine learning platforms: TensorFlow, PyTorch and Neural Designer for an approximation benchmark. Read more.
|9 Dec 2020||Carlos Barranquero & Roberto Lopez|
Finding datasets for performance benchmarking can be difficult. Indeed, ad hoc datasets lack the consistency required by key performance indicators such as data capacity, training speed, model accuracy, and inference speed.
This post introduces a family of datasets known as the Rosenbrock Dataset Suite. The objective is to facilitate benchmarking of machine learning platforms. Read more.
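The suite takes its name from the classic Rosenbrock function. As a rough illustration of how such a benchmark dataset can be generated, here is a minimal sketch; the sampling range, sample sizes, and helper names are our own illustrative assumptions, not the suite's actual specification.

```python
import random

def rosenbrock(xs):
    """Classic n-dimensional Rosenbrock function, minimum 0 at (1, ..., 1)."""
    return sum(100 * (xs[i + 1] - xs[i] ** 2) ** 2 + (1 - xs[i]) ** 2
               for i in range(len(xs) - 1))

def make_dataset(n_samples, n_inputs, seed=0):
    """Sample random inputs and pair them with the Rosenbrock target value."""
    rng = random.Random(seed)
    rows = []
    for _ in range(n_samples):
        xs = [rng.uniform(-2.048, 2.048) for _ in range(n_inputs)]
        rows.append(xs + [rosenbrock(xs)])  # last column is the target
    return rows
```

Varying `n_samples` and `n_inputs` yields datasets of controlled size and dimensionality, which is what makes a family like this convenient for consistent benchmarking.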
|1 Dec 2020||Carlos Barranquero|
TensorFlow, PyTorch and Neural Designer are three popular machine learning platforms developed by Google, Facebook and Artelnics, respectively.
This post compares the GPU training speed of these platforms for an approximation benchmark. Read more.
|10 Nov 2020||Roberto Lopez|
In machine learning, benchmarking is used to compare tools and identify the best performing technologies in the industry.
However, comparing different machine learning platforms can be a difficult task due to the large number of factors involved in the performance of a tool. Read more.
|14 Sep 2020||Alberto Quesada|
The procedure used to carry out the learning process in a neural network is called the optimization algorithm.
The learning problem in neural networks is formulated in terms of the minimization of a loss function. Read more.
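As a toy illustration of that minimization idea, here is gradient descent applied to a one-parameter loss, loss(w) = (w - 3)^2. This is only a sketch of the general principle; the post itself covers the optimization algorithms used on real, high-dimensional neural-network losses.

```python
def gradient_descent(grad, w0, lr=0.1, steps=100):
    """Repeatedly step against the gradient to minimize a loss."""
    w = w0
    for _ in range(steps):
        w -= lr * grad(w)
    return w

# The gradient of (w - 3)^2 is 2 * (w - 3); the minimum is at w = 3.
w_star = gradient_descent(lambda w: 2 * (w - 3), w0=0.0)
```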
|14 Sep 2020||Alberto Quesada & Roberto Lopez|
An outlier is a data point that is distant from other, similar points. Outliers may be due to variability in the measurement, or they may indicate experimental errors.
In this post, we talk about 3 different methods of dealing with outliers. Read more.
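One widely used detection method, the interquartile-range rule, can be sketched in a few lines (this is a generic illustration, not necessarily one of the three methods the post discusses):

```python
def iqr_outliers(values, k=1.5):
    """Return the values lying outside [Q1 - k*IQR, Q3 + k*IQR]."""
    data = sorted(values)
    n = len(data)

    def quantile(q):
        # Simple quantile estimate via linear interpolation.
        pos = q * (n - 1)
        lo = int(pos)
        hi = min(lo + 1, n - 1)
        return data[lo] + (pos - lo) * (data[hi] - data[lo])

    q1, q3 = quantile(0.25), quantile(0.75)
    iqr = q3 - q1
    lower, upper = q1 - k * iqr, q3 + k * iqr
    return [v for v in values if v < lower or v > upper]
```

The factor `k = 1.5` is the conventional choice; larger values flag only the most extreme points.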
|12 Jan 2017||Fernando Gómez, Alberto Quesada & Roberto Lopez|
The genetic algorithm is a stochastic method for function optimization based on natural genetics and biological evolution.
In this article, we show how genetic algorithms can be applied to optimize a predictive model's performance by selecting the most relevant features. Read more.
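The idea can be sketched as follows: each individual is a bitmask over the features, and selection, crossover, and mutation evolve the population toward better masks. The fitness function below is a hypothetical toy that rewards a fixed set of "informative" features; a real implementation, as in the article, would score each mask by training and evaluating a model.

```python
import random

random.seed(0)  # deterministic toy run

N_FEATURES = 8
INFORMATIVE = {1, 3, 6}  # assumed ground truth for this toy fitness only

def fitness(mask):
    # Reward informative features, penalize the rest (a stand-in for model error).
    return sum(1 if i in INFORMATIVE else -1
               for i, bit in enumerate(mask) if bit)

def evolve(pop_size=20, generations=40, mutation_rate=0.05):
    pop = [[random.randint(0, 1) for _ in range(N_FEATURES)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]              # selection: keep the fitter half
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, N_FEATURES)   # one-point crossover
            child = a[:cut] + b[cut:]
            for i in range(N_FEATURES):             # mutation
                if random.random() < mutation_rate:
                    child[i] = 1 - child[i]
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
```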
|10 Jul 2017||Pablo Martín & Roberto Lopez|
Sales forecasting is an essential task for the management of a store.
In this article, we use two years of sales data from a drug store to predict its sales one week in advance. Read more.
|25 Nov 2016||Roberto Lopez|
Predictive analytics uses data to discover complex relationships, recognize unknown patterns, forecast trends, find associations, and more.
It allows us to anticipate the future and make better decisions. Read more.
|22 Feb 2017 (Revisited: 21 Mar 2020)||Roberto Lopez|
There are many different types of neuron models, of which the perceptron is the most important.
In this article, we explain the mathematics of this neuron model. Read more.
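The core computation is simple: a weighted sum of the inputs plus a bias, passed through an activation function. A minimal sketch, where the weights and bias are illustrative values (here chosen so the neuron implements a logical AND):

```python
def perceptron(inputs, weights, bias,
               activation=lambda s: 1.0 if s >= 0 else 0.0):
    """Weighted sum plus bias, passed through a step activation."""
    s = sum(w * x for w, x in zip(weights, inputs)) + bias
    return activation(s)

# With weights (1, 1) and bias -1.5, the neuron fires only
# when both binary inputs are 1.
```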
|28 Mar 2017||Alberto Quesada & Roberto Lopez|
Forecasting of power demand plays an essential role in the electric industry. It provides the basis for making decisions in power system planning and operation.
This article's objective is to explain all the factors that lead to demand change and determine the underlying causes. Read more.
|24 Feb 2017||Pablo Martín|
Depending on the type of problem that we are analyzing, some specific methods may help us study the predictive model's performance in depth.
In this article, we focus on the 6 most used testing analysis methods for binary classification problems. Read more.
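One of the standard testing analyses for binary classifiers is the confusion matrix, which counts true/false positives and negatives. A minimal sketch (the six methods themselves are detailed in the article):

```python
def confusion_matrix(y_true, y_pred):
    """Count true/false positives and negatives for binary labels (0/1)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    return {"tp": tp, "tn": tn, "fp": fp, "fn": fn}
```

Metrics such as accuracy, precision, and recall all derive from these four counts.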
|16 May 2017||Pablo Martín|
Principal Component Analysis (PCA) is a statistical technique that identifies underlying linear patterns in a data set.
It allows a data set to be expressed in terms of another data set of significantly lower dimension, without much loss of information. Read more.
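For two-dimensional data, the first principal component can be computed directly from the covariance matrix, since the leading eigenvector of a symmetric 2x2 matrix has a closed form. A minimal sketch (the function name is ours, for illustration):

```python
import math

def first_principal_component(points):
    """Leading eigenvector of the 2x2 covariance matrix of (x, y) points."""
    n = len(points)
    mx = sum(p[0] for p in points) / n
    my = sum(p[1] for p in points) / n
    centered = [(x - mx, y - my) for x, y in points]
    # Sample covariance matrix entries.
    cxx = sum(x * x for x, _ in centered) / (n - 1)
    cyy = sum(y * y for _, y in centered) / (n - 1)
    cxy = sum(x * y for x, y in centered) / (n - 1)
    # Leading eigenvalue of [[cxx, cxy], [cxy, cyy]].
    trace, det = cxx + cyy, cxx * cyy - cxy * cxy
    lam = trace / 2 + math.sqrt(max(trace * trace / 4 - det, 0.0))
    # Corresponding eigenvector, normalized to unit length.
    vx, vy = (cxy, lam - cxx) if abs(cxy) > 1e-12 else (1.0, 0.0)
    norm = math.hypot(vx, vy)
    return (vx / norm, vy / norm)
```

Projecting the centered points onto this direction gives the one-dimensional representation that preserves the most variance.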
|25 Nov 2016||Roberto Lopez|
Telemarketing is one of the most used forms of direct marketing by all types of companies. This technique can be effective at generating sales, but it requires a strict selection of potential clients.
Advanced Analytics allows us to select individual targets, which results in increased profitability. Read more.
|30 Mar 2017||Pablo Martín|
Our objective is to analyze a dataset from a grocery store to create a recommendation system.
This system will be capable of generating accurate recommendations about products that the user may have an interest in. Read more.
|7 Dec 2016 (Revisited: 19 Jan 2021)||Pablo Martín|
Advanced Analytics is the set of techniques used to discover intricate relationships, recognize complex patterns, or predict current trends in your data.
The objective is to model data from internal and external variables to obtain useful insights that result in smarter decisions and better business outcomes. Read more.
|23 Jan 2017||Pablo Martín|
Nowadays, the risk assessment process carried out by insurance companies has become obsolete.
By analyzing the available information on the customers stored by the company, we can develop risk models that evaluate new customers faster and more accurately. Read more.
|29 Nov 2018||Roberto Lopez|
In this post, we formulate the feature selection problem and describe the most used algorithms in practice: Growing inputs, pruning inputs, and genetic algorithms. Read more.
|14 Apr 2019||Roberto Lopez|
Neural Designer participates in the R&D project, "Enhancing education and training through data-driven adaptable games in flipped classrooms (FLIP2G)".
This project is framed in the Erasmus+ Programme of the European Union to support education, training, youth, and sport in Europe. Read more.