mlpack IRC logs, 2020-01-21

Logs for the day 2020-01-21 (starts at 0:00 UTC) are shown below.

--- Log opened Tue Jan 21 00:00:26 2020
00:12 -!- ibtihaj [cbd7be0c@] has quit [Ping timeout: 260 seconds]
00:29 -!- kyrre [uid207859@fsf/member/kyrre] has quit [Quit: Connection closed for inactivity]
04:26 -!- percyX [~percyx@] has joined #mlpack
04:36 -!- percyX [~percyx@] has quit [Quit: AndroIRC - Android IRC Client ( )]
05:07 < kartikdutt18Gitt> Hi @himanshupathak21061998, I have left some comments on your PR that helped me in resolving the issue. Could you try them too?
05:25 -!- jenkins-mlpack2 [] has quit [Ping timeout: 248 seconds]
05:27 -!- jenkins-mlpack2 [] has joined #mlpack
06:21 -!- kyrre [uid207859@fsf/member/kyrre] has joined #mlpack
06:58 < HimanshuPathakGi> Hey zoq, I have made the changes as per your suggestion. Thanks for helping, and sorry for the slow response.
06:59 < HimanshuPathakGi> @kartikdutt18 Thanks that comment helped
07:01 < kartikdutt18Gitt> you are welcome.
08:31 -!- kyrre [uid207859@fsf/member/kyrre] has quit [Quit: Connection closed for inactivity]
09:40 < jenkins-mlpack2> Yippee, build fixed!
09:40 < jenkins-mlpack2> Project docker mlpack nightly build build #589: FIXED in 4 hr 26 min:
09:52 -!- mmm3 [88e8115e@] has joined #mlpack
09:59 -!- mmm3 [88e8115e@] has quit [Remote host closed the connection]
11:50 < Param-29Gitter[m> Hello, I wanted to work on Profiling for Parallelization for GSoC 2020. Are there any specific bugs, resources, or tasks I should work on to understand the requirements of this project? Also, if I could have the email ID of the mentor for this project, it would really be helpful.
12:13 < zoq> Param-29: Hello, at the moment we don't have an open issue that is related, but what you could do is search the codebase and see if you can find a method that you think could be improved, e.g. by using OpenMP.
12:15 < zoq> Param-29: About the mail: we like to communicate via public channels, so that the whole community can participate in the discussion, provide feedback, etc.
12:23 -!- AryanJ [2f1f6345@] has joined #mlpack
12:24 -!- AryanJ [2f1f6345@] has quit [Remote host closed the connection]
12:38 < Param-29Gitter[m> @zoq I am just a beginner in OpenMP, so if there are any specific algorithms / first issues I could work on, please let me know.
15:11 -!- ImQ009 [~ImQ009@unaffiliated/imq009] has joined #mlpack
15:47 -!- hunting-party104 [hunting-pa@gateway/shell/] has joined #mlpack
15:52 -!- vancao82 [01343ad2@] has joined #mlpack
15:55 < hunting-party104> Hi everyone, I was looking at `sequential.hpp` and `sequential_impl.hpp` and observed that the constructor takes an argument `const bool model`. Could anyone please elaborate on what this argument does, since the comment "Expose the all network modules." in the code isn't helping much? What does that statement mean?
15:57 < hunting-party104> Does it mean that if it is set to false, the entire sequential layer itself acts as a layer to some model?
15:58 < zoq> Param-29: For an example, you could look into the regularized SVD function; also, the SIMD code Ryan wrote for the decision tree might be interesting. Check the gini_gain.hpp file for some more details.
16:02 < zoq> hunting-party104: Right, if true the seq class acts as a layer and will expose every layer that was added, so e.g. the weight initialization will be done by e.g. the FFN class, since Model() will return all layers in this case. If set to false, none of the layers are exposed. This can be helpful if a layer is used multiple times and e.g. the weight initialization is already handled. Hope that makes sense.
16:07 < hunting-party104> <zoq "hunting-party10: Right, if true "> Exposed as in all layers and their weights can be seen? So if a layer is being used again, will it just copy the pre-existing one?
16:14 < zoq> We don't copy the weights or layers, here is an example: LinearLayer layerA; SequenceLayer(true) layerB; layerB.Add(layerA); model.Add(layerA); model.Add(layerB).
16:14 < zoq> In this case we added layerA to the model twice: directly, using model.Add(layerA), and another time via model.Add(layerB). If model = true, the FFN class will go through the model and first initialize layerA and then layerB; since layerB wraps and exposes layerA, the weights are set again. It's the same layer (shared), so it will change/overwrite the first initialization.
16:20 -!- vancao82 [01343ad2@] has quit [Remote host closed the connection]
16:28 < hunting-party104> zoq: Correct me if I'm wrong: so when an FFN is created with the above configuration, it first sees the directly-added layerA, then sees layerB which exposes layerA, so this indirect layerA points to the first one created?
16:30 < hunting-party104> I'm not able to understand why this is helpful, since after training both layerA's would have different weights in any case, so won't they just overwrite each other?
16:35 < zoq> They don't have different weights; they have the same weights, as initialization happens only once, before training. The configuration above isn't useful, just an example of what model is for. A useful example would be if you'd like to branch out; one example is the inception model. In this case you would use multiple layers of type seq layer, and you only want to initialize each layer once.
16:40 < Nakul[m]> Hey zoq, I haven't changed anything in Softmax_regression_test.cpp but it fails 2 tests. I am just curious: was it fine before, or did I make some mistake? Since the callback is finally fixed :)
16:40 < zoq> Nakul[m]: Will check later.
16:41 < hunting-party104> zoq: I see ,ill check out the inception model. Thanks a lot !
16:41 < hunting-party104> Atleast that resolves a lot of doubts which i had
16:42 < Nakul[m]> basically these 2 test suites: ```SoftmaxRegressionFitIntercept``` and ```SoftmaxRegressionOptimizerTrainTest```
16:42 < Nakul[m]> > Nakul: Will check later.
16:42 < Nakul[m]> fine
17:24 -!- ibtihaj [cbd7be0c@] has joined #mlpack
17:26 < Nakul[m]> Hey guys, which version is the ```models``` repo currently on? I just need it for writing the CMake.
17:27 < Nakul[m]> or should I put the default, 1.0?
17:28 < Param-29Gitter[m> Thanks. I will surely look into it.
18:00 -!- ibtihaj [cbd7be0c@] has quit [Ping timeout: 260 seconds]
18:22 -!- kyrre [uid207859@fsf/member/kyrre] has joined #mlpack
19:33 < Nakul[m]> zoq: rcurtin: the reason these test suites fail is, I think, a wrong hypothesis: ```hypothesis = arma::exp(
19:33 < Nakul[m]> arma::repmat(parameters.col(0), 1, dataset.n_cols) +
19:33 < Nakul[m]> parameters.cols(1, parameters.n_cols - 1) * dataset);```
19:33 < Nakul[m]> here is the reference I have taken
19:33 < Nakul[m]> please have a look
19:43 -!- ImQ009 [~ImQ009@unaffiliated/imq009] has quit [Quit: Leaving]
20:03 < zoq> Nakul[m]: Not sure I see the issue, can you elaborate on that?
20:15 < Nakul[m]> well, I am not sure, because the implementation I saw above is a bit complicated to explain, but
20:15 < Nakul[m]> I observed that when we pass ```fitIntercept```, this hypothesis is calculated and we get a dimensionality error, e.g. ```incompatible matrix dimensions: 2x6 and 5x1000```. That is why I thought to ask about it.
20:15 < Nakul[m]> In both the ```SoftmaxRegressionFitIntercept``` and ```SoftmaxRegressionOptimizerTrainTest``` tests we are passing fitIntercept as true.
22:31 -!- kyrre [uid207859@fsf/member/kyrre] has quit [Quit: Connection closed for inactivity]
22:33 -!- kyrre [uid207859@fsf/member/kyrre] has joined #mlpack
22:42 -!- ibtihaj [cbd7be0c@] has joined #mlpack
22:44 < ibtihaj> Hey, I am new to machine learning but really want to participate in GSoC for mlpack. Can anyone suggest what project I should choose for GSoC 2020?
22:49 < zoq> ibtihaj: The advice I can give you is to go with the project you are excited about; we will add more ideas in the coming days, and you are also free to propose your own ideas.
22:50 < ibtihaj> So can I propose my idea here, just to verify whether it is legit or not?
22:51 < zoq> Here or on the mailing list works just fine.
22:52 < ibtihaj> Thank you :]
--- Log closed Wed Jan 22 00:00:28 2020