mlpack IRC logs, 2020-02-29

Logs for the day 2020-02-29 (starts at 0:00 UTC) are shown below.

--- Log opened Sat Feb 29 00:00:07 2020
01:54 -!- kay [2f1fa97f@47.31.169.127] has joined #mlpack
01:55 -!- kay is now known as Guest54148
01:56 -!- Guest54148 [2f1fa97f@47.31.169.127] has quit [Remote host closed the connection]
02:49 -!- tanvi [b64a58a6@182.74.88.166] has joined #mlpack
04:17 -!- tanvi [b64a58a6@182.74.88.166] has quit [Remote host closed the connection]
04:55 < birm[m]1> Nakul: I've copied modified versions of the Azure Pipelines and Travis CI configurations on my forks to help some with that.
05:11 * SaraanshTandonGi sent a long message: < https://matrix.org/_matrix/media/r0/download/matrix.org/kYKECsfVlbKWlEoLflTBAxJA >
05:17 < SaraanshTandonGi> Also, where can I find the implementation of the training procedure?
05:17 < SaraanshTandonGi> I looked in ffn_impl.hpp but there is just the call to the optimizer.
05:42 * SaraanshTandonGi sent a long message: < https://matrix.org/_matrix/media/r0/download/matrix.org/DRJvWOKQsNjcUkSuYYlsQlaa >
06:16 < kartikdutt18Gitt> @saraansh1999, for an example of training (on MNIST), take a look at [this](https://github.com/mlpack/models/blob/master/Kaggle/DigitRecognizerCNN/src/DigitRecognizerCNN.cpp).
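The linked example is the authoritative reference; for orientation, here is a minimal hedged sketch of training a small FFN with mlpack's 3.x API. The layer sizes, data shapes, and optimizer settings are illustrative, not taken from the linked code:

```cpp
// Minimal sketch of training an mlpack FFN (mlpack 3.x era API).
#include <mlpack/core.hpp>
#include <mlpack/methods/ann/ffn.hpp>
#include <mlpack/methods/ann/layer/layer.hpp>
#include <mlpack/methods/ann/loss_functions/negative_log_likelihood.hpp>
#include <ensmallen.hpp>

using namespace mlpack::ann;

int main()
{
  // Each column is one data point; labels are in [1, numClasses] for
  // NegativeLogLikelihood. Random data stands in for a real dataset here.
  arma::mat trainData(784, 1000, arma::fill::randu);
  arma::mat trainLabels = arma::randi<arma::mat>(1, 1000,
      arma::distr_param(1, 10));

  FFN<NegativeLogLikelihood<>> model;
  model.Add<Linear<>>(784, 64);  // 784 inputs, e.g. 28x28 images
  model.Add<ReLULayer<>>();
  model.Add<Linear<>>(64, 10);   // 10 output classes
  model.Add<LogSoftMax<>>();

  // Train() hands the model and data to the ensmallen optimizer.
  ens::Adam opt(0.001 /* step size */, 50 /* batch size */);
  model.Train(trainData, trainLabels, opt);
}
```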
07:00 -!- khimrajGitter[m] [gitterkhim@gateway/shell/matrix.org/x-rtcvrqkpycmiempt] has joined #mlpack
07:00 < khimrajGitter[m]> Hi @zoq @ShikharJ, I want to contribute to the GAN model under the Essential Deep Learning Modules project. I have gone through the mlpack code base and have a sufficient understanding of how it works. Could you please suggest some current issues in the GAN implementation, so that I can work on those before submitting a proposal for GSoC 2020?
07:37 < PrinceGuptaGitte> Hi @kartikdutt18, about refactoring the activation function code to remove `Fn()` and `Deriv()`: is it only for those which are implemented as a layer, like `elu.hpp`,
07:37 < PrinceGuptaGitte> or is it also for those in the activation_functions folder?
07:38 < PrinceGuptaGitte> Because ISRU is implemented as a layer, and SQNL like the normal ones in the activation_functions folder.
07:45 < jenkins-mlpack2> Project docker ensmallen nightly build build #185: ABORTED in 2 hr 30 min: http://ci.mlpack.org/job/docker%20ensmallen%20nightly%20build/185/
08:01 < kartikdutt18Gitt> Yes, it is for those activation functions implemented as layers. I think I left a comment on every PR to change to the newer implementation.
08:08 < PrinceGuptaGitte> @kartikdutt18 should I remove the `Inv` function as well? Because having `Inv()` makes the `Deriv()` code clean
08:20 * SaraanshTandonGi sent a long message: < https://matrix.org/_matrix/media/r0/download/matrix.org/ozaGtympgZYkkBSxNLpOJggZ >
08:20 * SaraanshTandonGi sent a long message: < https://matrix.org/_matrix/media/r0/download/matrix.org/XxtYGKUHBUkoLMazNUuILSJO >
08:21 < SaraanshTandonGi> Also, it might be helpful to see the implementation of the Train function. Can someone point me towards its location? I seem to be a bit lost. TIA
08:24 < kartikdutt18Gitt> Hi @prince776, using Fn, Deriv and Inv does make the code cleaner; however, implementing them directly in Forward and Backward saves space, and it also becomes easier for anyone to understand what the Forward and Backward functions do. For the `inverse` function: since it's a private member function it doesn't have much use for a user, so we can implement it directly as part of Backward.
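As an illustration of the refactoring being discussed, here is a hedged sketch of an activation layer whose math lives directly in `Forward()`/`Backward()` rather than in private `Fn()`/`Deriv()`/`Inv()` helpers. The layer is hypothetical (not an actual mlpack class) and the signatures are simplified; the real mlpack layer API carries more template parameters:

```cpp
// Hypothetical "softsign" activation layer with the math inlined into
// Forward()/Backward(), in the style the comment above describes.
#include <armadillo>

class SoftSignLayer
{
 public:
  // Forward pass: y = x / (1 + |x|), computed inline.
  void Forward(const arma::mat& input, arma::mat& output)
  {
    output = input / (1 + arma::abs(input));
  }

  // Backward pass: dy/dx = 1 / (1 + |x|)^2, also inline, so no separate
  // Deriv()/Inv() pair is needed.
  void Backward(const arma::mat& input, const arma::mat& gy, arma::mat& g)
  {
    g = gy / arma::square(1 + arma::abs(input));
  }
};
```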
08:24 < PrinceGuptaGitte> ok, I'll do it that way
08:28 < kartikdutt18Gitt> @saraansh1999, can I have a look at the code (you could share the link using an online IDE)? I am not really sure what is causing the error.
08:37 < SaraanshTandonGi> Nvm, I got it. Thanks anyways. :)
09:06 < kartikdutt18Gitt> Great.
09:16 -!- ImQ009 [~ImQ009@unaffiliated/imq009] has joined #mlpack
10:13 < PrinceGuptaGitte> Hi @kartikdutt18, I've fixed the comment style issue of `layer_names.hpp` in PR #2243
10:16 < kartikdutt18Gitt> Great, thanks.
10:19 -!- prudhvi-hack [0e8bb5e5@14.139.181.229] has joined #mlpack
10:22 -!- prudhvi-hack [0e8bb5e5@14.139.181.229] has quit [Remote host closed the connection]
10:49 < SaraanshTandonGi> Is there any way to trace the function calls inside the mlpack library? I'm calling the Split function and getting a Mat error inside it somewhere, but gdb doesn't give a trace inside the function by default. It gives something like:
10:49 * SaraanshTandonGi sent a long message: < https://matrix.org/_matrix/media/r0/download/matrix.org/hLvaSfxWmihvQtaCZiiLAmdM >
10:49 * SaraanshTandonGi sent a long message: < https://matrix.org/_matrix/media/r0/download/matrix.org/eGcRCZcPxvrxyugTfGwdqVrW >
11:05 * SaraanshTandonGi sent a long message: < https://matrix.org/_matrix/media/r0/download/matrix.org/uioTHHkypMxJxlGpXJSHLfCn >
11:05 < metahost> SaranshTandon you can step through the execution a step at a time or set breakpoints. You may find this helpful: https://dustymabe.com/2012/12/17/trace-function-calls-using-gdb-revisited/
11:05 < metahost> SaraanshTandonGi: ^
11:21 < SaraanshTandonGi> Thanks a lot!
12:17 * SaraanshTandonGi sent a long message: < https://matrix.org/_matrix/media/r0/download/matrix.org/drlxYRyzjPUCSmwNffhopKUM >
12:29 * SaraanshTandonGi sent a long message: < https://matrix.org/_matrix/media/r0/download/matrix.org/QfoYJrYXOQVaVSWqzgmpLNPq >
12:29 < SaraanshTandonGi> (edited) ... to 10000, the ... => ... to 1000, the ...
12:34 < SaraanshTandonGi> Also, any suggestions on how to approach these problems, so that I don't have to ask again and again, would be appreciated.
13:25 < GauravSinghGitte> Hey @saraansh1999, is the value of ITERATIONS_PER_CYCLE same as the number of columns in your dataset?
14:11 -!- Omar93 [9cd0b9cd@156.208.185.205] has joined #mlpack
14:13 -!- OmarWagih1Gitter [gitterom_2@gateway/shell/matrix.org/x-rayliojgkbrlblzg] has joined #mlpack
14:13 < OmarWagih1Gitter> Hey all, just a heads up, I think the Slack channel link is broken on the community page
15:28 -!- Omar93 [9cd0b9cd@156.208.185.205] has quit [Remote host closed the connection]
16:13 < SaraanshTandonGi> > Hey @saraansh1999, is the value of ITERATIONS_PER_CYCLE same as the number of columns in your dataset?
16:13 < SaraanshTandonGi> No, I have around 30000 cols
16:13 < SaraanshTandonGi> 50 per batch
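For context on this exchange: `ITERATIONS_PER_CYCLE` is a name from Saraansh's code, which isn't shown in the log. If the constant is meant to cover one full pass over the data, then with the numbers quoted above it would count batches, not columns — a hedged guess, sketched below:

```cpp
#include <cstddef>
#include <iostream>

int main()
{
  // Numbers quoted in the exchange above: 30000 points, 50 per batch.
  const std::size_t numColumns = 30000;  // data points (one per column)
  const std::size_t batchSize  = 50;     // points per batch

  // One full pass over the data is numColumns / batchSize batches.
  std::cout << numColumns / batchSize << "\n";  // prints 600
}
```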
16:29 < rcurtin> OmarWagih1Gitter: thanks for pointing that out, let me debug it :)
16:30 < rcurtin> fixed! the docker container simply wasn't running :)
16:51 < Nakul[m]> @rcurtin:matrix.org: would I be allowed to open a PR related to CMake in the models repo, as part of refactoring, before GSoC?
16:52 < Nakul[m]> @rcurtin:matrix.org: am I allowed to open a PR related to CMake as part of refactoring in the mlpack repo, before GSoC?
16:54 -!- favre49 [75f287e2@117.242.135.226] has joined #mlpack
16:56 < favre49> Just out of curiosity, our site footer says that our copyright is 2007-2019. I know nothing about copyrights and licenses; why does that end in 2019?
16:57 < favre49> Also, LICENSE.txt says the copyright is 2007-2018
16:57 < favre49> I have no clue if this is something that matters at all or if I'm just being nitpicky here
17:00 -!- favre49 [75f287e2@117.242.135.226] has quit [Remote host closed the connection]
17:48 < rcurtin> favre49: I think because nobody updated it to say 2020 :)
17:48 < rcurtin> if you wanted to do that feel free! I just overlooked it
18:23 < zoq> rcurtin: Looks like the auto approval bot isn't working ... maybe another API update
18:31 -!- eadwu [265ffddf@dhcp095-253-223.wireless.buffalo.edu] has joined #mlpack
18:31 < rcurtin> blah, let me check on it
18:52 < eadwu> For GSoC, what's the bare minimum cutoff of dumbness tolerated pre-proposal (since studying is a thing before the summer)? For reinforcement learning, I haven't done any of that; the deepest I've gone in neural networks is a simple and straightforward ANN layout whose values/weights were controlled by a genetic algorithm.
18:55 < zoq> eadwu: We all have to start somewhere, so we don't expect a student to be an expert; however, strong knowledge of the topic is definitely helpful.
19:07 < rcurtin> okay, I've fixed some errors with mlpack-bot, but it doesn't seem like it's doing the stale sweep...
19:07 < rcurtin> or the auto-approve
19:09 < rcurtin> looks like we got bitten by https://github.com/probot/stale/issues/253
19:09 < rcurtin> just a wonderful reminder of how painful upgrading things in JS world is
19:09 < rcurtin> :)
19:10 < rcurtin> let's see if that fixes it... I guess we should know in a handful of hours
19:11 < zoq> crazy
19:11 < zoq> thanks for looking into it
19:11 < zoq> so much fun every time there is an issue with the bot
19:12 < rcurtin> yeah, really :)
19:12 < rcurtin> the only problem with automation is fixing it when everything inevitably goes wrong :)
19:13 < rcurtin> Nakul[m]: sure, I don't see any issue with it, but if you do part of your project before GSoC even starts make sure that there is still enough left in the timeline to fill the rest of the summer
19:52 < PrinceGuptaGitte> Hi @zoq, about the `summary()` function in the neural network code: I now understand the concept of serialization/de-serialization with boost, thanks to the resources you shared. But I am confused about how and why to use that in the `summary()` function. We still have to manually access the member variables of the class after de-serializing, but that could be done using the specific visitors, as is done throughout the `FFN` or `RNN` class.
19:52 < PrinceGuptaGitte> Another option would be to just get all the data members through de-serialization and Log them, but that would be an unnecessary amount of data, especially because member variables like `delta` are not at all useful in summarizing the model.
19:53 < PrinceGuptaGitte> Or am I missing something important?
20:01 < Nakul[m]> > Nakul: sure, I don't see any issue with it, but if you do part of your project before GSoC even starts make sure that there is still enough left in the timeline to fill the rest of the summer
20:01 < Nakul[m]> Well, if the work ends early I would love to work (or help, if someone is already working on it) on visualization, my favorite idea in the idea list.
20:30 < eadwu> Is the BLAS requirement for Armadillo referring to OpenBLAS or BLAS?
20:31 < eadwu> Oh, ignore that question; it was answered in the next few words of the README
20:55 < rcurtin> eadwu: perfect, that means we did a good job with the README if it correctly predicted your next question :)
20:57 < SaraanshTandonGi> I was trying to trace back a Train call in the codebase, but I am stuck at DeterministicSetVisitor. I see that boost is applying the visitor to each layer, but where is the implementation of the visit or accept functions?
21:00 < zoq> SaraanshTandonGi: https://github.com/mlpack/mlpack/tree/master/src/mlpack/methods/ann/visitor
21:02 < zoq> PrinceGuptaGitte: We don't have to access it, all the information is already there; maybe a first step is to serialize a model and take a look at the output (txt, xml, etc.).
21:02 < SaraanshTandonGi> `boost::apply_visitor(DeterministicSetVisitor(deterministic), layer->Model()[i]);`
21:02 < SaraanshTandonGi> I see this but where do I trace this back to?
21:03 < SaraanshTandonGi> Where is the code which calls the forward visitor?
21:03 < zoq> PrinceGuptaGitte: Also, you are right, not all of the data is useful; we would have to filter it, but we could also provide things like the weight variance for each layer.
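A hedged sketch of the first step zoq suggests — serializing a model to XML with mlpack's `data::Save` so its stored members can be inspected by eye. The model, layer sizes, and filename are illustrative:

```cpp
// Serialize a small FFN to XML and inspect the archive contents by eye
// (mlpack 3.x era, boost::serialization underneath).
#include <mlpack/core.hpp>
#include <mlpack/methods/ann/ffn.hpp>
#include <mlpack/methods/ann/layer/layer.hpp>

using namespace mlpack::ann;

int main()
{
  FFN<> model;
  model.Add<Linear<>>(10, 5);
  model.Add<SigmoidLayer<>>();
  model.ResetParameters();  // allocate weights so they show up in the file

  // The archive format is inferred from the extension; "model" is the
  // object's name inside the archive.
  mlpack::data::Save("model.xml", "model", model, false);
}
```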
21:04 < zoq> SaraanshTandonGi: You mean the FFN Train function?
21:06 -!- ImQ009 [~ImQ009@unaffiliated/imq009] has quit [Quit: Leaving]
21:06 * SaraanshTandonGi sent a long message: < https://matrix.org/_matrix/media/r0/download/matrix.org/wagMjLSvbgsYDayAoyqNmImC >
21:06 < zoq> SaraanshTandonGi: I haven't had time yet to take a look at your other messages, so maybe you already answered the question, or maybe there is a simple solution for your problem.
21:06 < SaraanshTandonGi> Now I know that the forward visitor calls the layer's Forward function
21:07 < SaraanshTandonGi> @zoq I want to know this irrespective of the problem
21:07 < SaraanshTandonGi> I'll try to figure out a solution on my own once I know this
21:07 < SaraanshTandonGi> > Now I know that the forward visitor calls the layer's Forward function
21:07 < SaraanshTandonGi> So how do we go from the DeterministicSetVisitor to the ForwardVisitor?
21:08 < SaraanshTandonGi> Also, what exactly is the point of a DeterministicSetVisitor? Why isn't there a direct call to the forward visitor?
21:11 < zoq> SaraanshTandonGi: Unfortunately you can't just access/modify a std::variant/boost::variant directly; you have to use a visitor to do so.
21:12 * SaraanshTandonGi sent a long message: < https://matrix.org/_matrix/media/r0/download/matrix.org/kpWevmTDhzHgraKVhqDXfJDo >
21:13 < zoq> The visitor implementation (apply_visitor, etc.) is part of boost.
21:15 < SaraanshTandonGi> So what exactly happens after this call: `boost::apply_visitor(DeterministicSetVisitor(deterministic), layer->Model()[i]);`?
21:18 < SaraanshTandonGi> Or simply, how does the DeterministicSetVisitor ultimately lead to the Forward call?
21:27 < SaraanshTandonGi> Any resources to help me through this?
21:31 < zoq> You can search for boost::static_visitor for more details, but in the end apply_visitor calls the DeterministicSetVisitor on one layer, which calls LayerDeterministic, and that sets the deterministic value if that layer implements the method.
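A hedged, self-contained sketch of the dispatch mechanism zoq describes: `boost::apply_visitor` routing a `static_visitor` to whichever type a variant currently holds. The two toy layer types are hypothetical, and mlpack's real DeterministicSetVisitor uses template detection (a LayerDeterministic helper) rather than one overload per layer type:

```cpp
#include <boost/variant.hpp>
#include <iostream>

// Two toy "layers": only one of them has a Deterministic() member.
struct DropoutLike { bool deterministic = false;
                     bool& Deterministic() { return deterministic; } };
struct LinearLike  { /* no Deterministic() member */ };

using LayerVariant = boost::variant<DropoutLike, LinearLike>;

// The visitor supplies one operator() per case; apply_visitor picks the
// right one based on the type currently stored in the variant.
struct DeterministicSet : boost::static_visitor<void>
{
  bool value;
  explicit DeterministicSet(bool v) : value(v) {}

  void operator()(DropoutLike& layer) const { layer.Deterministic() = value; }
  void operator()(LinearLike&) const { /* nothing to set */ }
};

int main()
{
  LayerVariant layer = DropoutLike();
  boost::apply_visitor(DeterministicSet(true), layer);
  std::cout << boost::get<DropoutLike>(layer).Deterministic() << "\n";  // 1
}
```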
21:31 < PrinceGuptaGitte> Thanks for your input @zoq, I'll see how the serialized model looks as a string and then try to filter out the important parts, or something else depending on what it shows
21:32 < zoq> You are probably looking for ForwardVisitor, which calls the Forward function of a layer if the layer implements that function.
21:32 < zoq> That is done here: https://github.com/mlpack/mlpack/blob/537484b8cc937bcab0ed0de175626e326eac30f2/src/mlpack/methods/ann/ffn_impl.hpp#L134
21:33 < zoq> FFN<OutputLayerType, InitializationRuleType, CustomLayers...>::Forward() is actually called by the optimizer.
21:33 < zoq> Which is part of ensmallen
21:33 < PrinceGuptaGitte> I'll work on it tomorrow since it's already 3 AM
21:34 < zoq> So let's say you use an SGD-based optimizer, the line you are looking for is: https://github.com/mlpack/ensmallen/blob/master/include/ensmallen_bits/sgd/sgd_impl.hpp#L145
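To summarize the call chain traced in this exchange, a hedged sketch of an SGD-style optimize loop — simplified names, not ensmallen's actual code. The key point is that `EvaluateWithGradient()` is where, for mlpack's FFN, the network's Forward pass (loss) and Backward pass (gradient) run internally:

```cpp
#include <algorithm>
#include <cstddef>
#include <armadillo>

// Simplified sketch of an SGD-style Optimize() loop over a decomposable
// objective such as mlpack's FFN.
template<typename FunctionType>
double Optimize(FunctionType& function, arma::mat& params,
                const std::size_t batchSize, const std::size_t maxIterations,
                const double stepSize)
{
  const std::size_t numFunctions = function.NumFunctions();
  double objective = 0.0;
  arma::mat gradient;

  for (std::size_t i = 0, start = 0; i < maxIterations; ++i)
  {
    // The last batch of an epoch may be smaller than batchSize.
    const std::size_t batch = std::min(batchSize, numFunctions - start);

    // For FFN, Forward() and Backward() happen inside this call.
    objective = function.EvaluateWithGradient(params, start, gradient, batch);

    params -= stepSize * gradient;           // vanilla SGD step
    start = (start + batch) % numFunctions;  // advance to the next batch
  }
  return objective;
}
```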
22:23 -!- eadwu [265ffddf@dhcp095-253-223.wireless.buffalo.edu] has quit [Remote host closed the connection]
22:54 -!- travis-ci [~travis-ci@ec2-34-201-147-62.compute-1.amazonaws.com] has joined #mlpack
22:54 < travis-ci> shrit/ensmallen#10 (early_stopping - 444213d : Omar Shrit): The build has errored.
22:54 < travis-ci> Change view : https://github.com/shrit/ensmallen/compare/65e80962b3b9...444213d525d7
22:54 < travis-ci> Build details : https://travis-ci.com/shrit/ensmallen/builds/151204575
22:54 -!- travis-ci [~travis-ci@ec2-34-201-147-62.compute-1.amazonaws.com] has left #mlpack []
22:58 -!- travis-ci [~travis-ci@ec2-34-201-147-62.compute-1.amazonaws.com] has joined #mlpack
22:58 < travis-ci> shrit/ensmallen#11 (early_stopping - 89f7947 : Omar Shrit): The build has errored.
22:58 < travis-ci> Change view : https://github.com/shrit/ensmallen/compare/444213d525d7...89f794768fb8
22:58 < travis-ci> Build details : https://travis-ci.com/shrit/ensmallen/builds/151204683
22:58 -!- travis-ci [~travis-ci@ec2-34-201-147-62.compute-1.amazonaws.com] has left #mlpack []
23:04 -!- travis-ci [~travis-ci@ec2-54-164-1-155.compute-1.amazonaws.com] has joined #mlpack
23:04 < travis-ci> shrit/models#6 (digit - cff9a35 : Omar Shrit): The build is still failing.
23:04 < travis-ci> Change view : https://github.com/shrit/models/compare/49550486a60c...cff9a3590404
23:04 < travis-ci> Build details : https://travis-ci.com/shrit/models/builds/151204810
23:04 -!- travis-ci [~travis-ci@ec2-54-164-1-155.compute-1.amazonaws.com] has left #mlpack []
23:20 -!- travis-ci [~travis-ci@ec2-54-164-1-155.compute-1.amazonaws.com] has joined #mlpack
23:20 < travis-ci> shrit/models#7 (digit - 7cb1366 : Omar Shrit): The build is still failing.
23:20 < travis-ci> Change view : https://github.com/shrit/models/compare/cff9a3590404...7cb1366e82d7
23:20 < travis-ci> Build details : https://travis-ci.com/shrit/models/builds/151205385
23:20 -!- travis-ci [~travis-ci@ec2-54-164-1-155.compute-1.amazonaws.com] has left #mlpack []
23:23 -!- eadwu [450c1671@dhcp012-022-113.wireless.buffalo.edu] has joined #mlpack
--- Log closed Sun Mar 01 00:00:09 2020