mlpack IRC logs, 2020-02-08

Logs for the day 2020-02-08 (starts at 0:00 UTC) are shown below.

--- Log opened Sat Feb 08 00:00:52 2020
01:31 -!- dendre [~dendre@138.229.116.186] has joined #mlpack
02:32 < Param-29Gitter[m> @zoq also take a look at #2169 once you have a chance.
02:57 < Param-29Gitter[m> Also, if there is another program on which I can check my performance, please let me know.
03:34 -!- dendre [~dendre@138.229.116.186] has quit [Quit: Leaving]
04:46 < jeffin143[m]> zoq: found a weird behaviour on Mac
04:46 < jeffin143[m]> If you hit make and then install it
04:47 < jeffin143[m]> And then change your branch
04:47 < jeffin143[m]> And hit make mlpack_test, it would throw an error:
04:47 < jeffin143[m]> Precompiled header has changed, recompile it
06:04 < PrinceGuptaGitte> Hi @zoq, you asked me to implement the ISRU function as a layer. I understand that all activation functions that have parameters are implemented as a layer so that their parameters can be used. However, wouldn't they then become inaccessible to the BaseLayer class? Because BaseLayer calls its activation function's methods assuming they are static methods.
06:05 < PrinceGuptaGitte> (edited) ... a layer. I ... => ... a layer instead of class with static functions. I ...
06:06 < PrinceGuptaGitte> Doesn't it make it really inconsistent to use different types of activation functions? Am I missing something that binds all of it together?
06:30 < kartikdutt18Gitt> Hi @prince776, I think the activation functions that are implemented as layers aren't present in base_layer. base_layer only includes the activation functions that are present in the activation_functions folder; this is done so they can be serialized as layers. The issue that @zoq mentioned was that if I declare a layer with parameter alpha = 0.5, then I can't change it again for that layer, i.e. it's not accessible because it is a
06:30 < kartikdutt18Gitt> function parameter. What I think @zoq wanted was that if I declare a layer as layer = ISRU(0.5), then to change alpha I can simply do layer.alpha() = 0.7
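
For context, a rough illustration of the accessor pattern kartikdutt18 is describing, loosely modelled on mlpack's ELU layer; the ISRU class and member names below are an assumption for illustration, not the actual mlpack code:

    #include <armadillo>

    // Hypothetical ISRU layer sketch: alpha is stored as a member, so it can
    // be read and changed after construction, unlike a bare function
    // parameter baked into an activation-function call.
    template<typename InputDataType = arma::mat,
             typename OutputDataType = arma::mat>
    class ISRU
    {
     public:
      ISRU(const double alpha = 1.0) : alpha(alpha) { }

      //! Get the alpha parameter.
      double const& Alpha() const { return alpha; }
      //! Modify the alpha parameter, e.g. layer.Alpha() = 0.7.
      double& Alpha() { return alpha; }

     private:
      double alpha;
    };
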
06:38 < PrinceGuptaGitte> Then every activation function that has parameters needs to be implemented as a layer class. I was just wondering, is that a good thing? Because now we have two ways to make a layer, as sketched below:
06:38 < PrinceGuptaGitte> 1) BaseLayer<ActivationFunction, Inp, Out> layer; // only activation functions which are in the activation_functions folder
06:38 < PrinceGuptaGitte> 2)HardTanH hadTanHLayer;
06:39 < PrinceGuptaGitte> (edited) ... 2)HardTanH hadTanHLayer; => ... 2)HardTanH hardTanHLayer;
06:39 < PrinceGuptaGitte> (edited) ... 2)HardTanH hardTanHLayer; => ... 2)HardTanH hardTanHLayer; // layer for activation functions which have parameter(s)
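
For reference, a rough sketch of the two construction routes being contrasted above, assuming mlpack's existing TanhFunction and HardTanH layer; the exact headers and constructor defaults may differ:

    #include <mlpack/core.hpp>
    #include <mlpack/methods/ann/layer/base_layer.hpp>
    #include <mlpack/methods/ann/layer/hard_tanh.hpp>

    using namespace mlpack::ann;

    int main()
    {
      // 1) Parameter-free activation: wrap a static activation function in
      //    BaseLayer (normally through aliases such as TanHLayer<>).
      BaseLayer<TanhFunction, arma::mat, arma::mat> tanhLayer;

      // 2) Parameterised activation: a dedicated layer class that stores its
      //    parameters as members.
      HardTanH<> hardTanHLayer;
    }
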
07:04 -!- saksham [6ee39df1@110.227.157.241] has joined #mlpack
07:07 -!- saksham [6ee39df1@110.227.157.241] has quit [Remote host closed the connection]
08:35 -!- ImQ009 [~ImQ009@unaffiliated/imq009] has joined #mlpack
08:38 < jenkins-mlpack2> Yippee, build fixed!
08:38 < jenkins-mlpack2> Project docker mlpack nightly build build #607: FIXED in 3 hr 24 min: http://ci.mlpack.org/job/docker%20mlpack%20nightly%20build/607/
09:35 < kartikdutt18Gitt> Hi @zoq, when you get a chance have a look at the new benchmarks in #2178.
10:07 < PrinceGuptaGitte> Hi @kartikdutt18, thanks for your input earlier. I have implemented the ISRU function as a layer (like ELU). I've also run the tests with no errors. Before pushing the code I was wondering, should I also remove the earlier implementation where alpha was unusable?
10:33 < GauravSinghGitte> Hey everyone, I am done with the implementation of the CELU activation function but have doubts regarding the citation of the paper in the code. The link to the original paper is https://arxiv.org/pdf/1704.07483.pdf. Can somebody tell me how to cite it?
10:48 < kartikdutt18Gitt> @prince776, I will take a look.
10:50 < kartikdutt18Gitt> @gaurav-singh1998, take a look at the Mish activation function; the paper describing Mish has been cited there.
10:55 < PrinceGuptaGitte> Thanks, I've pushed it. I hope the implementation is complete.
10:56 -!- M_slack_mlpack_4 [slackml_21@gateway/shell/matrix.org/x-wizriasevapuhqse] has joined #mlpack
10:58 -!- M_slack_mlpack_4 is now known as Saksham[m]
10:58 * Saksham[m] sent a long message: < https://matrix.org/_matrix/media/r0/download/matrix.org/mHuNaBzqDWraKjPQEOlNqMMv >
11:02 < pickle-rick[m]> Saksham: You should check out https://www.mlpack.org/community.html and https://www.mlpack.org/gsoc.html to get started. :)
11:03 < pickle-rick[m]> Saksham: You might find this link helpful as well: https://github.com/mlpack/mlpack/blob/master/CONTRIBUTING.md
11:05 < Saksham[m]> <pickle-rick[m] "Saksham: You should check out ht"> Should I work on tasks related to a specific project (the one I am interested in), or should I resolve general issues to start?
11:07 < pickle-rick[m]> In my opinion, general issues would be a great way to get familiar with the codebase, but it's up to you really. You could open up issues / pull requests for the projects you're interested in as well.
12:36 < zoq> jeffin143: I wonder if this has something to do with cotire (caching), we could disable cotire and see if that helps.
12:45 < zoq> PrinceGupta: You could use BaseLayer<MishFunction, InputDataType, OutputDataType>, and I agree that this isn't the same as using ELU, but a user probably never uses BaseLayer directly and rather uses the alias defined at the end of the file: https://github.com/mlpack/mlpack/blob/master/src/mlpack/methods/ann/layer/base_layer.hpp#L208 -> MishFunctionLayer, so the interface is the same. Using the BaseLayer
12:45 < zoq> avoids code duplication.
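
The alias zoq points to looks roughly like this; this is a paraphrased excerpt, so check the linked line of base_layer.hpp for the exact definition:

    // Paraphrased from base_layer.hpp: each activation gets a one-line alias,
    // while all the Forward()/Backward() plumbing lives in BaseLayer itself.
    template<
        class ActivationFunction = MishFunction,
        typename InputDataType = arma::mat,
        typename OutputDataType = arma::mat
    >
    using MishFunctionLayer = BaseLayer<
        ActivationFunction, InputDataType, OutputDataType>;
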
13:13 -!- wiking_ [~wiking@huwico/staff/wiking] has joined #mlpack
13:14 -!- wiking_ is now known as wiking
13:28 < jeffin143[m]> zoq: if cotire (caching) helps a user of mlpack to speed things up, I don't mind as a developer having to hit make mlpack again. I agree it makes a developer's life harder since he has to run make again, and that would mean an hour of time, but I'm not sure we should remove caching.
13:28 < jeffin143[m]> I should probably go through it once.
13:36 < PrinceGuptaGitte> Hi @kartikdutt18, I've tried doing it the way you suggested, that is:
13:40 < PrinceGuptaGitte> Hi @kartikdutt18, I've tried so much but I can't seem to figure out a way around it. You suggested to use:
13:40 < PrinceGuptaGitte> `x = (x != 0) * arma::pow(y / x, 3) + (x == 0) * DBL_MAX;`, but this still generates nan values, as we are still adding the term `(x != 0) * arma::pow(y / x, 3)` even when x == 0, which is 0 * 0/0 = nan. I tried some alternate routes but they weren't working either.
13:42 < jeffin143[m]> rcurtin : do you work in developing julia ?
13:43 < jeffin143[m]> Also i am so happy to see Mlpack grow by leaps and bounds :)
13:43 < jeffin143[m]> Saturday Sunday , probably can finish up a pr or some issue or help with the review
13:44 < PrinceGuptaGitte> Like, since y = 0 only for x = 0, I tried to replace the 0s in x and y with 1s so that when we divide we get 1 (the required derivative), but this doesn't work either since y is const, and copying the matrix would be very slow.
13:45 < PrinceGuptaGitte> > Hi @kartikdutt18, I've tried so much but I can't seem to figure out a way around it. You suggested to use:
13:45 < PrinceGuptaGitte> > `x = (x != 0) * arma::pow(y / x, 3) + (x == 0) * DBL_MAX;`, but this still generates nan values, as we are still adding the term `(x != 0) * arma::pow(y / x, 3)` even when x == 0, which is 0 * 0/0 = nan. I tried some alternate routes but they weren't working either.
13:45 < PrinceGuptaGitte> For the same reason, the border-value checking in the Inverse function is also not working.
13:50 < kartikdutt18Gitt> Ohh, I understand, because the boolean will still give a logical 0, and the division by zero is still evaluated.
13:51 < kartikdutt18Gitt> I think rather than this you can use a for loop to set y(i) = Deriv(x(i))
13:51 < kartikdutt18Gitt> and include this condition in Deriv, in case you haven't already done so.
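
A minimal sketch of the element-wise approach being suggested, assuming an ISRU-style derivative whose x == 0 limit is 1 (as noted in the discussion above); the Deriv helpers here are illustrative, not the actual mlpack code:

    #include <armadillo>
    #include <cmath>

    // Scalar derivative with the x == 0 case handled explicitly, so the
    // vectorised 0 * (0 / 0) = nan problem never shows up.
    inline double Deriv(const double x, const double y)
    {
      return (x == 0) ? 1.0 : std::pow(y / x, 3);
    }

    // Fill the derivative matrix element by element instead of using a single
    // Armadillo expression over the whole matrix.
    inline void Deriv(const arma::mat& x, const arma::mat& y, arma::mat& dy)
    {
      dy.set_size(arma::size(x));
      for (arma::uword i = 0; i < x.n_elem; ++i)
        dy(i) = Deriv(x(i), y(i));
    }
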
14:00 < PrinceGuptaGitte> I considered that but won't that be slow?
14:01 < kartikdutt18Gitt> Quite contrary.
14:01 < kartikdutt18Gitt> refer #2178
14:01 < PrinceGuptaGitte> Thanks
14:02 < PrinceGuptaGitte> I thought armadillo vectorized matrix operations like bumpy
14:03 < PrinceGuptaGitte> Numpy*
14:03 < kartikdutt18Gitt> I tested both just in case.
15:19 < PrinceGuptaGitte> @kartikdutt18 I updated the code with a for loop and all the edge-case detection in the Deriv and Inverse functions. All tests ran perfectly and there were no warnings during the build. Please take a look when you get time.
15:19 < PrinceGuptaGitte> Sorry I think I might have pinged you too much this time, I'll try to do this less often so as not to disturb.
15:44 -!- togo [~togo@2a02:6d40:3486:a701:a9b0:c28b:e397:2299] has joined #mlpack
15:53 < kartikdutt18Gitt> No worries, I will take a look.
17:10 -!- togo [~togo@2a02:6d40:3486:a701:a9b0:c28b:e397:2299] has quit [Remote host closed the connection]
17:10 -!- togo [~togo@2a02:6d40:3486:a701:a9b0:c28b:e397:2299] has joined #mlpack
17:15 -!- togo [~togo@2a02:6d40:3486:a701:a9b0:c28b:e397:2299] has quit [Quit: Leaving]
17:16 -!- togo [~togo@2a02:6d40:3486:a701:a9b0:c28b:e397:2299] has joined #mlpack
17:52 -!- UmarJ [~UmarJ@111.68.97.205] has quit [Ping timeout: 260 seconds]
17:57 -!- wiking [~wiking@huwico/staff/wiking] has quit [Remote host closed the connection]
18:36 < pickle-rick[m]> Hey, having a little trouble building a test executable... How exactly do you modify and run the CMake files to build the tests? Thanks.
19:21 < zoq> pickle-rick: Hello, are you trying to add a new test?
19:21 < zoq> pickle-rick: Btw. nice picture :)
19:23 < pickle-rick[m]> No, I just want to run the existing q_learning_test file. Glad you liked the pic :)
19:26 < zoq> pickle-rick: I see, in this case you can just build mlpack, that will produce an executable called mlpack_test.
19:26 < zoq> To run all tests of the QLearningTest test suite you could use: bin/mlpack_test -t QLearningTest, or to run only one test use: bin/mlpack_test -t QLearningTest/CartPoleWithDQN
19:26 < pickle-rick[m]> Oh cool. Thanks
19:27 < pickle-rick[m]> I'd like to know how to go about adding new tests as well. If you could point me towards some resources, that'd be helpful.
19:29 < pickle-rick[m]> Or should I just look at the tests folder, and figure out the general pattern?
19:31 < zoq> So we use the Boost Unit Test Framework for testing: https://www.boost.org/doc/libs/1_60_0/libs/test/doc/html/boost_test/tests_organization/test_suite.html
19:31 < zoq> To add a new test, you have to update tests/CMakeLists.txt and add the new test file; for the test file itself you can take a look at the existing tests, for example.
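
For reference, a bare-bones test file in the Boost Unit Test style the existing mlpack tests follow; the suite and test names here are made up, so mimic a real file under src/mlpack/tests/ (and register it in tests/CMakeLists.txt) for actual work:

    #include <mlpack/core.hpp>
    #include <boost/test/unit_test.hpp>

    BOOST_AUTO_TEST_SUITE(ExampleTest);

    // A trivial sanity check; a real test would exercise an mlpack method and
    // verify its output.
    BOOST_AUTO_TEST_CASE(SimpleSanityCheck)
    {
      arma::vec a("1 2 3");
      BOOST_REQUIRE_EQUAL(a.n_elem, 3);
      BOOST_REQUIRE_CLOSE(arma::accu(a), 6.0, 1e-5);
    }

    BOOST_AUTO_TEST_SUITE_END();
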
19:34 < pickle-rick[m]> Ah I see. Appreciate the info!
19:34 -!- ocelaiwo[m] [ocelaiwoma@gateway/shell/matrix.org/x-pbnahdjgtbwstqhr] has joined #mlpack
21:41 -!- wiking [~wiking@huwico/staff/wiking] has joined #mlpack
22:01 -!- ImQ009 [~ImQ009@unaffiliated/imq009] has quit [Quit: Leaving]
22:20 -!- wiking [~wiking@huwico/staff/wiking] has quit [Remote host closed the connection]
--- Log closed Sun Feb 09 00:00:54 2020