

Showing posts from 2017

Gradient Descent in Dynamo

Hello and welcome back. So far we have developed a simple regression model and solved it using linear algebra. In this article, I am going to explore an alternative approach called Gradient Descent. It is an iterative method that finds a local (or, for convex problems, global) minimum of a cost function by repeatedly taking steps proportional to the negative of its gradient. To apply the Gradient Descent algorithm, we first need to define the cost function for our machine learning problem. We have a set of features for which the actual output is given, and we want to develop a model that predicts the output for a given set of features. We can represent the feature vectors for the complete training set as a matrix X , the corresponding output values as a vector y , and the predicted values as a vector \hat{y} . So our prediction model is \hat{y} = X*a and the error vector for the training set is err = \hat{y} - y . We would like to choose a cost function such that the overall error over the training set is minimized; a common choice is the mean of the squared errors.
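The post walks through this as a Dynamo graph; as a rough companion, the Python/NumPy sketch below shows the same idea under the assumption of a mean-squared-error cost for the linear model \hat{y} = X*a . The function name gradient_descent, the learning_rate value, and the toy data are illustrative choices, not taken from the post.

```python
import numpy as np

def gradient_descent(X, y, learning_rate=0.02, iterations=5000):
    """Approximately minimize the mean squared error cost for y_hat = X @ a."""
    m, n = X.shape
    a = np.zeros(n)                       # start with all coefficients at zero
    for _ in range(iterations):
        err = X @ a - y                   # error vector err = y_hat - y
        grad = (X.T @ err) / m            # gradient of the cost with respect to a
        a -= learning_rate * grad         # step against the gradient
    return a

# Toy example: recover the line y = 2 + 3*x from noisy samples
rng = np.random.default_rng(0)
x = rng.uniform(0, 10, 100)
X = np.column_stack([np.ones_like(x), x])     # prepend the constant feature x0 = 1
y = 2 + 3 * x + rng.normal(0, 0.1, 100)
print(gradient_descent(X, y))                 # should be close to [2, 3]
```

With a small enough learning rate the coefficients settle toward the same values the normal equation would give; too large a rate makes the iteration diverge.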

Polynomial regression and model comparison

Welcome back. In my previous post I described how we can perform linear regression using the normal equation in Dynamo, and I left off with a question: "what if the input and output are not linearly dependent?" Let's say we have a hypothesis that the housing price doesn't depend on the floor area and number of rooms linearly, but instead has the following relationship: y = a_{0} * x_{0} + a_{1} * x_{1} + a_{2} * x_{2} + a_{3} * x_{1} * x_{2} + a_{4} * x_{1}^2 + a_{5} * x_{2}^2 , where x_{0} = 1, \: x_{1} is the floor area and x_{2} is the number of rooms. Then we can introduce a few new features x_{3} = x_{1} * x_{2}, \: x_{4} = x_{1}^2, \: x_{5} = x_{2}^2 and perform linear regression to find the coefficient vector. The Dynamo graph to set up the feature vector looks as follows. Once the feature vector is set up, the rest is the same as in the previous linear regression example. Note that to predict the price for a given floor area and number of rooms, we need to construct the same feature vector again.
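For readers who want to try the same idea outside Dynamo, here is a minimal Python/NumPy sketch of the feature construction and the least-squares fit described above. The helper name polynomial_features and the housing numbers are made up for illustration; they are not from the post.

```python
import numpy as np

def polynomial_features(area, rooms):
    """Build [x0, x1, x2, x1*x2, x1^2, x2^2] for each sample, with x0 = 1."""
    x0 = np.ones_like(area)
    return np.column_stack([x0, area, rooms, area * rooms, area**2, rooms**2])

# Hypothetical training data: floor area, number of rooms, and observed price
area  = np.array([45.0, 60.0, 72.0, 85.0, 100.0, 110.0, 120.0, 150.0])
rooms = np.array([ 1.0,  2.0,  2.0,  3.0,   3.0,   3.0,   4.0,   4.0])
price = np.array([95.0, 128.0, 150.0, 181.0, 214.0, 236.0, 262.0, 330.0])

X = polynomial_features(area, rooms)
a, *_ = np.linalg.lstsq(X, price, rcond=None)   # least-squares fit (normal-equation step)

# Prediction reuses the same feature construction for the query point
query = polynomial_features(np.array([90.0]), np.array([2.0]))
print(query @ a)
```

The key point mirrors the post: once the nonlinear terms are packed into the feature vector, the fitting step is ordinary linear regression, and every prediction must rebuild the same feature vector.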

Linear Regression in Dynamo

Today we are collecting more data than ever, but making sense of that data is very challenging. Machine learning helps us analyze the data and build an analytical model for future prediction. To create a machine learning prediction model, we usually develop a hypothesis based on our observation of the collected data and fine-tune the model by training it to reduce a cost function. One very simple hypothesis we can develop is to assume a linear relationship between the input and output parameters. Suppose we have data on housing prices and we assume that the price is linearly related to the floor area of the house; then we can use linear regression to predict the price of a house with a specific floor area. One of the most important steps towards building the hypothesis is being able to visualize the data and understand the trend. So let's first draw a scatter plot of our data as shown in the figure below. I am using the Grapher package to draw the scatter plot.
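The post goes on to build the fit as a Dynamo graph; as a reference point, here is a minimal Python/NumPy sketch of the same linear hypothesis price = a_{0} + a_{1} * area fitted with the normal equation a = (X^{T}X)^{-1}X^{T}y , which the next post refers back to. The floor areas and prices are invented sample values.

```python
import numpy as np

# Invented sample data: floor area and observed price
area  = np.array([45.0, 60.0, 72.0, 85.0, 100.0, 120.0])
price = np.array([90.0, 118.0, 140.0, 165.0, 198.0, 236.0])

X = np.column_stack([np.ones_like(area), area])   # feature matrix with an x0 = 1 column
a = np.linalg.solve(X.T @ X, X.T @ price)         # normal equation: a = (X^T X)^{-1} X^T y

print("intercept and slope:", a)
print("predicted price for a 95 sq m house:", np.array([1.0, 95.0]) @ a)
```

Plotting the fitted line over the scatter plot is the quickest sanity check that the linear hypothesis is reasonable for the data.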

DesignScript

It’s been a really long time since I wrote about anything. For the past many years, I have been busy working on a domain-specific language called DesignScript and its visual programming interface Dynamo. I am now moving to the world of reality capture, and I’ll miss working on this technology. I find it very useful, and every now and then I’ll keep coming back to it for geometric and programming experiments. Recently I have been learning machine learning and wondering if I can use Dynamo to explain some of its concepts. I’ll explore that further, but in the meantime let me explain what DesignScript and Dynamo are. DesignScript is a scripting language intended to provide programming capability to designers. Designers can describe their design intent effectively using this language. Unlike other procedural languages, it provides two great features: associativity and replication. The associativity in expressions helps preserve the design intent, whereas replication lets an expression written for a single value apply automatically across collections.