mlpack IRC logs, 2018-04-14

Logs for the day 2018-04-14 (starts at 0:00 UTC) are shown below.

--- Log opened Sat Apr 14 00:00:36 2018
00:36 -!- manthan [6725c94b@gateway/web/freenode/ip.103.37.201.75] has quit [Ping timeout: 260 seconds]
00:54 < windows> Hey, I spent 3 days trying to build mlpack on Ubuntu 14.04; each time it failed to finish building because my PC ran out of RAM. I tried following the Windows tutorial on building, but that didn't work either. So my alternative for mlpack on Windows is to use the Windows Subsystem for Linux. So far I have had more success with it, as it has now finished building and I got it installed.
00:59 < rcurtin> windows: are you building in a VM for Ubuntu? if so, you can increase the amount of RAM allocated to it
01:00 < rcurtin> I believe that mlpack needs 1.5GB+ of RAM to compile successfully; this is because of the heavy template use in the C++
01:00 < rcurtin> also for ubuntu you can be quick and just do 'apt-get install libmlpack-dev', but that is less useful if you are planning on modifying the library and not just using it
01:10 < windows> I was building on a cloud VM, and the 'apt-get install libmlpack-dev' package does not include the ann code.
01:17 -!- windows [ac00a108@gateway/web/freenode/ip.172.0.161.8] has quit [Ping timeout: 260 seconds]
01:34 < rcurtin> windows: oh, sorry, I see you said Ubuntu 14.04
01:34 < rcurtin> you would need, I think, 18.04 for the newest package... let me check
01:35 < rcurtin> ah, actually, sorry, mlpack 3 is only available in the repos for debian sid/unstable, not yet ubuntu
01:35 < rcurtin> probably the next ubuntu (18.10) will have it
01:35 < rcurtin> but that is a long time to wait :)
03:16 -!- windows [ac00a108@gateway/web/freenode/ip.172.0.161.8] has joined #mlpack
03:25 -!- csoni2 [~csoni@103.81.36.225] has quit [Read error: Connection reset by peer]
03:25 -!- sulan_ [~sulan_@563BE0E4.catv.pool.telekom.hu] has joined #mlpack
03:37 -!- csoni [~csoni@103.81.36.225] has joined #mlpack
03:44 -!- govg [~govg@unaffiliated/govg] has quit [Ping timeout: 276 seconds]
03:45 -!- sulan_ [~sulan_@563BE0E4.catv.pool.telekom.hu] has quit [Quit: Leaving]
03:46 -!- govg [~govg@unaffiliated/govg] has joined #mlpack
03:51 -!- govg [~govg@unaffiliated/govg] has quit [Ping timeout: 256 seconds]
04:30 -!- csoni [~csoni@103.81.36.225] has quit [Read error: Connection reset by peer]
04:45 -!- csoni [~csoni@103.81.36.225] has joined #mlpack
05:05 -!- windows [ac00a108@gateway/web/freenode/ip.172.0.161.8] has quit [Quit: Page closed]
06:09 -!- govg [~govg@unaffiliated/govg] has joined #mlpack
06:36 -!- vivekp [~vivek@unaffiliated/vivekp] has quit [Ping timeout: 240 seconds]
06:38 -!- vivekp [~vivek@unaffiliated/vivekp] has joined #mlpack
06:44 -!- ImQ009 [~ImQ009@unaffiliated/imq009] has joined #mlpack
06:44 -!- luffy1996 [uid281777@gateway/web/irccloud.com/x-rtsuphdsropdmpxt] has joined #mlpack
06:47 < luffy1996> @zoq, I went through the link you sent me. I think this might make my implementation more complex. What I want to do is add a softmax layer at the end of the ann network and then apply categorical cross entropy to compute the error and gradients. I guess cross entropy is already implemented in mlpack:
06:47 < luffy1996> https://github.com/mlpack/mlpack/blob/4bd01bbc98889e1ade49302b79d791275854be37/src/mlpack/methods/ann/layer/cross_entropy_error_impl.hpp . However, I am unable to find a softmax layer in mlpack.
06:49 < luffy1996> I can see that softmax regression has been implemented in mlpack at https://github.com/mlpack/mlpack/tree/4bd01bbc98889e1ade49302b79d791275854be37/src/mlpack/methods/softmax_regression, but I do not know how one can integrate this with an ann layer. Please let me know how I can proceed with this. Thanks
06:49 < luffy1996> @manthan If you have any ideas, feel free to speak. Thanks
06:54 -!- csoni [~csoni@103.81.36.225] has quit [Read error: Connection reset by peer]
07:08 -!- csoni [~csoni@103.81.36.225] has joined #mlpack
07:21 -!- csoni [~csoni@103.81.36.225] has quit [Read error: Connection reset by peer]
07:35 -!- csoni [~csoni@103.81.36.225] has joined #mlpack
07:59 -!- csoni [~csoni@103.81.36.225] has quit [Ping timeout: 268 seconds]
08:11 -!- csoni [~csoni@106.210.229.100] has joined #mlpack
08:31 -!- csoni [~csoni@106.210.229.100] has quit [Read error: Connection reset by peer]
08:32 -!- vpal [~vivek@unaffiliated/vivekp] has joined #mlpack
08:35 -!- vivekp [~vivek@unaffiliated/vivekp] has quit [Ping timeout: 260 seconds]
08:35 -!- vpal is now known as vivekp
08:35 -!- ricklly_ [~ricklly@2001:cc0:2020:4017:417e:bdf6:7d7b:750e] has joined #mlpack
08:45 -!- csoni [~csoni@106.210.202.50] has joined #mlpack
08:46 -!- csoni2 [~csoni@103.81.36.225] has joined #mlpack
08:46 -!- csoni [~csoni@106.210.202.50] has quit [Read error: Connection reset by peer]
09:11 -!- csoni2 [~csoni@103.81.36.225] has quit [Ping timeout: 256 seconds]
09:23 -!- csoni [~csoni@106.210.196.120] has joined #mlpack
09:30 -!- witness [uid10044@gateway/web/irccloud.com/x-shsieenzfjenljwt] has joined #mlpack
09:54 -!- csoni [~csoni@106.210.196.120] has quit [Read error: Connection reset by peer]
09:56 -!- manthan [6725c94b@gateway/web/freenode/ip.103.37.201.75] has joined #mlpack
10:08 -!- csoni [~csoni@106.210.203.161] has joined #mlpack
10:13 < manthan> luffy1996 : the softmax layer is P(y=j|x) = e^(x^T w_j) / sum_k e^(x^T w_k). Since it has trainable parameters, you need to implement the Forward(), Backward() and Gradient() functions. Forward() will contain the normal forward pass of the layer. Backward() will contain the derivative of the error with respect to the input. Gradient() will contain the derivative of the error with respect to the trainable parameters. "error" here means the error up to this layer in the backward pass.
10:16 < luffy1996> Does that mean that I have to go ahead and add the softmax layer to mlpack?
10:18 < manthan> https://eli.thegreenplace.net/2016/the-softmax-function-and-its-derivative/ this is a good source for the implementation.
10:19 < manthan> yes, adding a softmax layer to the ann/layer module would be better. You can then add the layer to your model with model.Add<Softmax<>>()
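For context, a minimal sketch of how the finished layer would plug in, assuming it gets added to ann/layer under the name Softmax. The Softmax<> layer itself is the thing being discussed and does not yet exist in mlpack, and inputSize/numClasses are placeholder values; FFN, Linear, and CrossEntropyError are existing mlpack classes.

    #include <mlpack/core.hpp>
    #include <mlpack/methods/ann/ffn.hpp>
    #include <mlpack/methods/ann/layer/layer.hpp>

    using namespace mlpack::ann;

    int main()
    {
      const size_t inputSize = 10;  // placeholder input dimensionality
      const size_t numClasses = 3;  // placeholder number of classes

      // CrossEntropyError<> as the output layer; the initialization rule is
      // left at FFN's default.
      FFN<CrossEntropyError<>> model;
      model.Add<Linear<>>(inputSize, numClasses);
      model.Add<Softmax<>>();  // hypothetical layer under discussion
    }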
10:21 < luffy1996> manthan: The link is very helpful. Thanks. Would you mind giving me an idea of how cross entropy is used in mlpack without softmax? I believe cross entropy is already implemented in mlpack.
10:26 < manthan> you can see ann_layer_test.cpp for the test on the cross entropy layer.
10:28 < manthan> the input is taken to be a vector of 8 values equal to 0.5 and the target a vector of 8 values equal to 1. The output is thus 8*log(2).
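Spelled out, assuming the usual cross entropy form for that test:

\[
CE = -\sum_{i=1}^{8}\bigl[t_i \log p_i + (1 - t_i)\log(1 - p_i)\bigr]
   = -\sum_{i=1}^{8}\log(0.5) = 8\log 2 \approx 5.545,
\]

since every target t_i = 1 removes the second term and every input p_i = 0.5.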
10:35 < luffy1996> I get how crossentropy error is used. I think I should go ahead and add a softmax layer for myself. Thanks :)
10:40 < manthan> ya, that would be better; I would be glad to help in case you want.
10:43 < luffy1996> Sure, I will ping you in case any help is needed. Thanks
11:34 < luffy1996> Hi manthan
11:34 < luffy1996> I am confused about the backward pass for softmax
11:34 < luffy1996> template<typename InputDataType, typename OutputDataType>
11:34 < luffy1996> template<typename InputType, typename OutputType>
11:34 < luffy1996> void Softmax<InputDataType, OutputDataType>::Forward(
11:34 < luffy1996> const InputType&& input, OutputType&& output)
11:34 < luffy1996> {
11:34 < luffy1996> arma::mat maxInput = arma::repmat(arma::max(input), input.n_rows, 1);
11:34 < luffy1996> arma::mat expInput = arma::exp(input - maxInput);
11:34 < luffy1996> // We will normalize the values to get probabilities.
11:34 < luffy1996> double sumExpInput = arma::accu(expInput);
11:34 < luffy1996> output = expInput / sumExpInput;
11:34 < luffy1996> }
11:34 < luffy1996> template<typename InputDataType, typename OutputDataType>
11:34 < luffy1996> template<typename eT>
11:34 < luffy1996> void Softmax<InputDataType, OutputDataType>::Backward(
11:35 < luffy1996> https://www.irccloud.com/pastebin/rUCrBFWF/
11:36 < luffy1996> Please refer to the snippet. The entire code got rejected while posting because of technical issues.
11:37 < luffy1996> In particular I would like to know how to proceed with the gradients for backward propagation
11:37 < luffy1996> Thanks
11:41 -!- csoni2 [~csoni@49.35.17.206] has joined #mlpack
11:42 < manthan> luffy1996 : g will contain the derivative of error with respect to input for Backward()
11:43 < manthan> so the link that I gave you above derives the derivative with respect to the input for the softmax layer
11:43 -!- sulan_ [~sulan_@563BE0E4.catv.pool.telekom.hu] has joined #mlpack
11:44 < manthan> it uses the quotient rule of differentiation
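Written out, with S_i = e^{x_i} / \sum_k e^{x_k}, the quotient rule computation manthan is referring to works out to

\[
\frac{\partial S_i}{\partial x_j}
  = \frac{\delta_{ij}\, e^{x_i} \sum_k e^{x_k} - e^{x_i} e^{x_j}}{\bigl(\sum_k e^{x_k}\bigr)^2}
  = S_i\,(\delta_{ij} - S_j),
\]

where \delta_{ij} is the Kronecker delta; collecting the entries gives the Jacobian J = \mathrm{diag}(S) - S S^\top.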
11:45 -!- csoni [~csoni@106.210.203.161] has quit [Ping timeout: 240 seconds]
11:46 -!- csoni2 [~csoni@49.35.17.206] has quit [Ping timeout: 265 seconds]
11:59 -!- csoni [~csoni@106.210.140.140] has joined #mlpack
12:03 < luffy1996> Do I need to construct the entire Jacobian matrix?
12:08 -!- csoni [~csoni@106.210.140.140] has quit [Ping timeout: 240 seconds]
12:35 < manthan> luffy1996 : i think yes you will have to make it
12:35 < manthan> and it is S_i * (δ_ij - S_j) for the (i, j) element, right?
12:39 -!- csoni [~csoni@106.193.195.20] has joined #mlpack
12:50 -!- csoni2 [~csoni@49.35.17.206] has joined #mlpack
12:50 < luffy1996> something like Jacob()*gy
12:50 < luffy1996> Am I correct here?
12:54 -!- csoni [~csoni@106.193.195.20] has quit [Ping timeout: 240 seconds]
12:54 -!- csoni2 [~csoni@49.35.17.206] has quit [Ping timeout: 265 seconds]
12:58 -!- manthan [6725c94b@gateway/web/freenode/ip.103.37.201.75] has quit [Ping timeout: 260 seconds]
13:01 < luffy1996> @zoq, @rcurtin Any comments ?
13:08 -!- csoni [~csoni@106.193.212.126] has joined #mlpack
13:57 -!- govg [~govg@unaffiliated/govg] has quit [Ping timeout: 260 seconds]
13:59 -!- govg [~govg@unaffiliated/govg] has joined #mlpack
14:45 -!- csoni [~csoni@106.193.212.126] has quit [Ping timeout: 240 seconds]
14:59 -!- csoni [~csoni@106.193.193.24] has joined #mlpack
15:25 -!- csoni [~csoni@106.193.193.24] has quit [Read error: Connection reset by peer]
15:39 -!- csoni [~csoni@106.193.134.1] has joined #mlpack
15:43 -!- robertohueso [~roberto@217.216.127.162.dyn.user.ono.com] has joined #mlpack
15:50 -!- witness [uid10044@gateway/web/irccloud.com/x-shsieenzfjenljwt] has quit [Quit: Connection closed for inactivity]
15:52 < robertohueso> Should we allow CLI / Python binding users to select different metrics/kernels/trees for an algorithm at runtime?
15:54 -!- govg [~govg@unaffiliated/govg] has quit [Ping timeout: 256 seconds]
15:56 -!- govg [~govg@unaffiliated/govg] has joined #mlpack
16:06 -!- csoni [~csoni@106.193.134.1] has quit [Read error: Connection reset by peer]
16:19 -!- ricklly_ [~ricklly@2001:cc0:2020:4017:417e:bdf6:7d7b:750e] has quit [Ping timeout: 255 seconds]
16:19 -!- dmatt [~quassel@2001:648:2800:240:c06a:18d0:32af:4c50] has joined #mlpack
16:20 -!- csoni [~csoni@106.193.180.228] has joined #mlpack
16:24 -!- luffy1996 [uid281777@gateway/web/irccloud.com/x-rtsuphdsropdmpxt] has quit [Quit: Connection closed for inactivity]
16:32 -!- csoni [~csoni@106.193.180.228] has quit [Read error: Connection reset by peer]
16:41 -!- Guest59383 [~pd@14.139.61.129] has joined #mlpack
16:46 -!- csoni [~csoni@103.81.36.176] has joined #mlpack
16:47 -!- vivekp [~vivek@unaffiliated/vivekp] has quit [Ping timeout: 245 seconds]
16:49 -!- vivekp [~vivek@unaffiliated/vivekp] has joined #mlpack
17:00 -!- dmatt_ [~quassel@2001:648:2800:240:7076:ed78:cdbe:68da] has joined #mlpack
17:03 -!- dmatt [~quassel@2001:648:2800:240:c06a:18d0:32af:4c50] has quit [Ping timeout: 265 seconds]
17:29 -!- manthan [6725c94b@gateway/web/freenode/ip.103.37.201.75] has joined #mlpack
17:29 < manthan> luffy1996 : yes it is Jacob() * gy
17:29 < manthan> and jacob() can be computed by using the derivative function of softmax wrt input
17:30 < manthan> let rcurtin and zoq clarify in case we missed something
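A minimal sketch of what that Backward() could look like, assuming Forward() additionally caches its probabilities in a member (here called softmaxOutput, one column); that member, the class itself, and the exact signature are assumptions in the style of the other ann layers, not existing mlpack API.

    // Hypothetical Backward() for the Softmax layer being discussed.
    template<typename InputDataType, typename OutputDataType>
    template<typename eT>
    void Softmax<InputDataType, OutputDataType>::Backward(
        const arma::Mat<eT>&& /* input */,
        arma::Mat<eT>&& gy,
        arma::Mat<eT>&& g)
    {
      // Jacobian of softmax: J = diag(S) - S * S^T, with S the probabilities
      // cached by the forward pass (assumed member softmaxOutput).
      arma::mat jacobian = arma::diagmat(softmaxOutput)
          - softmaxOutput * softmaxOutput.t();

      // J is symmetric, so J * gy is the same as J^T * gy.
      g = jacobian * gy;
    }

The forward pass would then need to keep a copy of its output so the Jacobian can be formed here, which is why caching it in a member is the natural choice.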
17:34 -!- csoni [~csoni@103.81.36.176] has quit [Read error: Connection reset by peer]
17:36 -!- sulan_ [~sulan_@563BE0E4.catv.pool.telekom.hu] has quit [Quit: Leaving]
17:36 -!- sulan_ [~sulan_@563BE0E4.catv.pool.telekom.hu] has joined #mlpack
17:36 < manthan> rcurtin : zoq : I have updated the decision tree pruning PR and have included pruning as a template parameter. No extra data members are added for the purpose. I have changed classProbabilities to hold classCounts. Could you have a look at the updated PR?
17:38 -!- Guest59383 [~pd@14.139.61.129] has quit [Ping timeout: 255 seconds]
17:43 -!- manthan [6725c94b@gateway/web/freenode/ip.103.37.201.75] has quit [Ping timeout: 260 seconds]
17:48 -!- dmatt [~quassel@2001:648:2800:240:6402:ef31:f36f:4161] has joined #mlpack
17:48 -!- dmatt [~quassel@2001:648:2800:240:6402:ef31:f36f:4161] has quit [Remote host closed the connection]
17:49 -!- csoni [~csoni@103.81.36.176] has joined #mlpack
17:51 -!- dmatt_ [~quassel@2001:648:2800:240:7076:ed78:cdbe:68da] has quit [Ping timeout: 255 seconds]
17:51 -!- Guest59383 [~pd@14.139.61.129] has joined #mlpack
17:52 -!- Guest59383 [~pd@14.139.61.129] has quit [Max SendQ exceeded]
17:52 -!- Guest59383 [~pd@14.139.61.129] has joined #mlpack
17:54 -!- Guest59383 [~pd@14.139.61.129] has quit [Max SendQ exceeded]
17:55 -!- Guest59383 [~pd@14.139.61.129] has joined #mlpack
18:13 -!- sulan_ [~sulan_@563BE0E4.catv.pool.telekom.hu] has quit [Quit: Leaving]
18:39 -!- vivekp [~vivek@unaffiliated/vivekp] has quit [Read error: Connection reset by peer]
18:41 -!- manthan [6725c94b@gateway/web/freenode/ip.103.37.201.75] has joined #mlpack
18:41 -!- vivekp [~vivek@unaffiliated/vivekp] has joined #mlpack
18:49 -!- csoni [~csoni@103.81.36.176] has quit [Read error: Connection reset by peer]
19:06 -!- Guest59383 [~pd@14.139.61.129] has quit [Read error: Connection reset by peer]
19:14 -!- sourabhvarshney1 [0e8b7937@gateway/web/freenode/ip.14.139.121.55] has joined #mlpack
19:16 -!- sourabhvarshney1 [0e8b7937@gateway/web/freenode/ip.14.139.121.55] has quit [Client Quit]
19:54 -!- csoni [~csoni@103.81.36.176] has joined #mlpack
20:49 -!- manthan [6725c94b@gateway/web/freenode/ip.103.37.201.75] has quit [Ping timeout: 260 seconds]
20:53 -!- ImQ009 [~ImQ009@unaffiliated/imq009] has quit [Quit: Leaving]
22:46 -!- csoni [~csoni@103.81.36.176] has quit [Ping timeout: 240 seconds]
--- Log closed Sun Apr 15 00:00:37 2018