mlpack IRC logs, 2020-03-21

Logs for the day 2020-03-21 (starts at 0:00 UTC) are shown below.

--- Log opened Sat Mar 21 00:00:37 2020
05:51 -!- favre49 [~favre49@106.51.108.60] has joined #mlpack
05:53 < favre49> I think it got buried in the chat earlier, so I'll ask again - If I'm not wrong, VAE will be in the models repository, while LSTM and MNIST will be in examples?
06:03 -!- favre49 [~favre49@106.51.108.60] has quit [Ping timeout: 264 seconds]
06:04 -!- favre49 [~favre49@106.51.28.212] has joined #mlpack
06:16 -!- favre49 [~favre49@106.51.28.212] has quit [Ping timeout: 264 seconds]
06:18 -!- favre49 [~favre49@49.207.58.225] has joined #mlpack
06:21 < kartikdutt18Gitt> I think I might have missed that message.
06:24 < kartikdutt18Gitt> Ohh, found it. So I think we can have the LeNet and AlexNet that I made in the models repo, plus a data loader which supports popular datasets like MNIST (currently) and later Pascal VOC, etc. We can also add both VAE models and remove them from the examples repo. What do you think?
07:02 < jeffin143[m]> > i usually just copy the code between the BOOST_AUTO_TEST_CASE(whateveralgorithm){ /code here/ }, change the "Log::Debug" to "std::cout", and paste it into the int main{ }, it works
07:02 < jeffin143[m]> Yes, that's one way of learning, along with too many print statements
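
As a minimal sketch of the workflow described in the quoted message (the algorithm and test body here are hypothetical, not taken from an actual mlpack test): copy the body of a BOOST_AUTO_TEST_CASE into a standalone main() and swap Log::Debug for std::cout.

    // Hypothetical standalone program built from a copied test case body.
    #include <armadillo>
    #include <iostream>

    int main()
    {
      // Body copied from an imaginary BOOST_AUTO_TEST_CASE(WhateverAlgorithm).
      arma::mat data = arma::randu<arma::mat>(3, 100);
      const double meanValue = arma::mean(arma::vectorise(data));

      // Originally: Log::Debug << "mean: " << meanValue << std::endl;
      std::cout << "mean: " << meanValue << std::endl;

      return 0;
    }
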
07:07 < favre49> kartikdutt18Gitt: Yeah that's what I was thinking as well. Good to be on the same page. I guess I'll make a PR that simplifies the examples repo down to DigitRecognizer and LSTM at some point
07:14 < kartikdutt18Gitt> @favre49, I will make the PR for models today. I already have an issue open in the examples repo for this. Here is the [link](https://github.com/mlpack/examples/issues/66). I think after [#56](https://github.com/mlpack/examples/pull/56) and [#55](https://github.com/mlpack/examples/pull/55) are merged, the next thing would be solving issue [#65](https://github.com/mlpack/examples/issues/65) and simplifying the repo to remove
07:14 < kartikdutt18Gitt> CMake. I think this plan will at least make the examples repo ready. What do you think?
07:38 < jenkins-mlpack2> Project docker mlpack nightly build build #648: NOW UNSTABLE in 3 hr 24 min: http://ci.mlpack.org/job/docker%20mlpack%20nightly%20build/648/
07:59 < favre49> kartikdutt18Gitt: Ah you're right, forgot about those. I'll wait till those PRs are merged before making any additional PRs
08:04 < bisakh[m]1> How do I run the test cases locally for validation? The build succeeds with the make command, though I'm having some errors.
08:13 < kartikdutt18Gitt> Hi bisakh, if you mean the tests, then you can run ./bin/mlpack_test -t TESTNAME, e.g. ./bin/mlpack_test -t ANNLayerTest
09:57 -!- ImQ009 [~ImQ009@unaffiliated/imq009] has joined #mlpack
10:14 < bisakh[m]1> Got it, Thanks.
10:58 -!- metahost [~metahost@irc.sayan.page] has quit [Quit: (getting back up soon)]
10:59 -!- metahost [~metahost@irc.sayan.page] has joined #mlpack
10:59 -!- metahost [~metahost@irc.sayan.page] has quit [Client Quit]
10:59 -!- metahost [~metahost@irc.sayan.page] has joined #mlpack
11:08 -!- mtnshh [73600053@115.96.0.83] has joined #mlpack
11:11 * hemal[m] sent a long message: < https://matrix.org/_matrix/media/r0/download/matrix.org/zliGLQRZzQlFlfvUbRLTsVqp >
11:12 < hemal[m]> Anyone else who knows armadillo and sparse matrices, please help me out here.
11:16 -!- mtnshh [73600053@115.96.0.83] has quit [Ping timeout: 240 seconds]
11:22 < LakshyaOjhaGitte> hey any mentor online?
11:31 -!- jenkins-mlpack2 [~PircBotx@knife.lugatgt.org] has quit [Ping timeout: 258 seconds]
11:31 -!- rcurtin [~ryan@knife.lugatgt.org] has quit [Ping timeout: 265 seconds]
11:56 -!- rcurtin [~ryan@knife.lugatgt.org] has joined #mlpack
11:56 -!- Topic for #mlpack: "mlpack: a fast, flexible machine learning library :: We don't always respond instantly, but we will respond; please be patient :: Logs at http://www.mlpack.org/irc/"
11:56 -!- Topic set by rcurtin [] [Mon Nov 12 22:39:13 2018]
11:56 [Users #mlpack]
11:56 [ aadarsh-asthanaG] [ EL-SHREIFGitter[] [ kartikdutt18Gitt] [ Nakul[m] ] [ riyouk[m] ] [ TaapasAgrawalGit]
11:56 [ abernauer[m] ] [ favre49 ] [ khimrajGitter[m]] [ naruarjun[m] ] [ robertohueso ] [ TanayMehtaGitter]
11:56 [ AbhinavvermaGitt] [ GarvTambiGitter[] [ KhizirSiddiquiGi] [ NishaGeorgeGitte] [ robotcatorGitter] [ TanviAgarwalGitt]
11:56 [ AbishaiEbenezerG] [ GauravSinghGitte] [ KimSangYeon-DGU[] [ nishantkr18[m] ] [ RohitKartikGitte] [ tejasvi[m] ]
11:56 [ AmeetKumarRanaGi] [ geek-2002Gitter[] [ kritika12298Gitt] [ NishantKumarGitt] [ RoHitRushilGitte] [ TrinhNgo[m] ]
11:56 [ AniThoGitter[m] ] [ gotadachi ] [ kuhaku ] [ ocelaiwo[m] ] [ RudraPatil[m] ] [ UmarGitter[m] ]
11:56 [ AnjishnuGitter[m] [ gtank___ ] [ kunal12298Gitter] [ OmarWagih1Gitter] [ ryan[m] ] [ Valliappan_CAGit]
11:56 [ AryamanBhagatGit] [ harshitaarya[m] ] [ kyrre ] [ outmanipulateGit] [ saksham189Gitter] [ vansika__ ]
11:56 [ ayush29[m] ] [ hemal[m] ] [ LakshyaOjhaGitte] [ Param-29Gitter[m] [ Saksham[m] ] [ vigsterkr[m] ]
11:56 [ azwn[m] ] [ himanshu_pathak[] [ lozhnikov ] [ petris ] [ SakshamRastogiGi] [ VSaicharanGitter]
11:56 [ benpa[m] ] [ HimanshuPathakGi] [ M_slack_18 ] [ pickle-rick[m] ] [ SaraanshTandonGi] [ zalava[m] ]
11:56 [ bhanukumarGitter] [ ImQ009 ] [ M_slack_mlpack13] [ PranavReddyP16Gi] [ Shikhar-SGitter[] [ zoq ]
11:56 [ birm[m]1 ] [ incrypt0 ] [ Manav-KumarGitte] [ PrinceGuptaGitte] [ shikharj[m] ] [ zoq[m] ]
11:56 [ bisakh[m]1 ] [ jacob-earleGitte] [ metahost ] [ PulkitgeraGitter] [ ShikharJaiswalGi]
11:56 [ bkb181[m] ] [ jeffin143[m] ] [ mlozhnikov[m]1 ] [ rahulverma7788Gi] [ shrit[m] ]
11:56 [ Cadair ] [ jenkins-mlpack2 ] [ mohona[m] ] [ rcurtin ] [ siddhant2001Gitt]
11:56 [ chopper_inbound4] [ JoelJosephGitter] [ MostafaNabiehGit] [ rcurtin[m] ] [ sreenik[m] ]
11:56 [ chopper_inbound[] [ johnsoncarl[m] ] [ MrityunjayTripat] [ RishabhGoel[m] ] [ SriramSKGitter[m]
11:56 -!- Irssi: #mlpack: Total of 103 nicks [0 ops, 0 halfops, 0 voices, 103 normal]
11:56 -!- Channel #mlpack created Tue Oct 11 18:35:40 2011
11:56 -!- Home page for #mlpack: http://www.mlpack.org
11:56 < kartikdutt18Gitt> Hey zoq, Can I get your opinion on this,
11:56 < kartikdutt18Gitt> > I think it got buried in the chat earlier, so I'll ask again - If I'm not wrong, VAE will be in the models repository, while LSTM and MNIST will be in examples?
11:56 -!- Irssi: Join to #mlpack was synced in 24 secs
11:57 < zoq> LakshyaOjhaGitte: I have an implementation for the paper using mlpack, I can push it somewhere if you think that is helpful.
11:58 < Param-29Gitter[m>
11:58 < Param-29Gitter[m> > `zoq on Freenode` There is no easy answer, it's a challenging problem, since you have to be familiar with the method and the implementation. And often you start with something and realize this isn't going in the right direction, so it's time-intensive as well.
11:58 < Param-29Gitter[m> Exactly 😅. I had a workaround for this: I thought of considering some algorithms which are less dependent on OpenBLAS (KNN and decision trees), and along with these we could have a set of algorithms I can try to work on and see if I can improve their performance (for GSoC 2020).
11:59 < zoq> kartikdutt18Gitt: I guess it could work in both, but it would have to be modified to make it fit; if you think it should go into the models repo, that's fine with me.
12:01 < zoq> Param-29Gitter[m: Agreed, so I guess ideally you'd like to come up with a list of potential candidates?
12:02 < Param-29Gitter[m> Yes something like that.
12:02 < zoq> Param-29Gitter[m: Okay, will think about it and get back to you later.
12:02 < kartikdutt18Gitt> Agreed, I mentioned the same in issue mlpack/examples#66 in case anyone has something else in mind, and yes, it would have to be changed to fit in both repos. Thanks a lot.
12:02 < GauravSinghGitte> Hi, @zoq in the formulation of `sigma` for the next population in the `CMAES` implementation [here](https://github.com/mlpack/ensmallen/blob/master/include/ensmallen_bits/cmaes/cmaes_impl.hpp#L219), it can be seen that a power of `0.3` is applied after the exponentiation, but in the tutorial paper on `CMAES` and on the Wikipedia page [here](https://en.wikipedia.org/wiki/CMA-ES) for the
12:02 < GauravSinghGitte> algorithm, only the exponentiation is done. Why has the `std::pow` computation been done here?
12:03 < Param-29Gitter[m> > `zoq on Freenode` Param-29 (Gitter): Okay, will think about it and get back to you later.
12:03 < Param-29Gitter[m> Ya... I'll be waiting for your reply :)
12:04 < GauravSinghGitte> (edited) ... done here? => ... done in the implementation?
12:07 < zoq> GauravSinghGitte: Ohh, I think this is a bug.
12:11 < GauravSinghGitte> @zoq So, should I open a PR regarding it?
12:11 < zoq> GauravSinghGitte: That would be great.
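
For reference, the standard CMA-ES step-size update given in the tutorial paper and on the Wikipedia page linked above is, in the usual notation,

    \sigma \leftarrow \sigma \cdot \exp\!\left( \frac{c_\sigma}{d_\sigma} \left( \frac{\lVert p_\sigma \rVert}{\mathbb{E}\,\lVert \mathcal{N}(0, I) \rVert} - 1 \right) \right)

with no additional outer power applied to the exponential, which is the discrepancy being pointed out.
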
12:20 < AnjishnuGitter[m> Hi @zoq. PR #2197 adds BCE with logits loss. As per [this](https://discuss.pytorch.org/t/what-is-the-difference-between-bcewithlogitsloss-and-multilabelsoftmarginloss/14944) and [this](https://discuss.pytorch.org/t/is-there-an-example-for-multi-class-multilabel-classification-in-pytorch/53579/10), multi-label soft margin loss returns the same value as BCE with logits loss up to an epsilon value of equality. However,
12:20 < AnjishnuGitter[m> they are conceptually different with respect to when each one is to be applied. So, I wanted to add multi-label soft margin loss in a separate PR. Should I go ahead with that?
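
For reference, a hedged sketch of the relationship described in those threads (standard definitions, not mlpack's exact implementation): BCE with logits applies a sigmoid followed by binary cross-entropy, and multi-label soft margin loss averages the same per-label term over the C labels, which is why the two agree up to numerical epsilon.

    \ell_{\mathrm{BCEWithLogits}}(x, y) = -\big[\, y \log \sigma(x) + (1 - y) \log(1 - \sigma(x)) \,\big],
    \qquad \sigma(x) = \frac{1}{1 + e^{-x}}

    \ell_{\mathrm{MultiLabelSoftMargin}}(x, y) = -\frac{1}{C} \sum_{i=1}^{C}
    \big[\, y_i \log \sigma(x_i) + (1 - y_i) \log(1 - \sigma(x_i)) \,\big]
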
12:20 -!- jenkins-mlpack2 [~PircBotx@knife.lugatgt.org] has quit [Ping timeout: 264 seconds]
12:20 < zoq> AnjishnuGitter[m: Yeah, I think that is a good idea.
12:20 -!- rcurtin [~ryan@knife.lugatgt.org] has quit [Ping timeout: 240 seconds]
12:46 -!- rcurtin [~ryan@knife.lugatgt.org] has joined #mlpack
12:46 -!- Topic for #mlpack: "mlpack: a fast, flexible machine learning library :: We don't always respond instantly, but we will respond; please be patient :: Logs at http://www.mlpack.org/irc/"
12:46 -!- Topic set by rcurtin [] [Mon Nov 12 22:39:13 2018]
12:46 [Users #mlpack]
12:46 [ aadarsh-asthanaG] [ EL-SHREIFGitter[] [ kartikdutt18Gitt] [ Nakul[m] ] [ riyouk[m] ] [ TaapasAgrawalGit]
12:46 [ abernauer[m] ] [ favre49 ] [ khimrajGitter[m]] [ naruarjun[m] ] [ robertohueso ] [ TanayMehtaGitter]
12:46 [ AbhinavvermaGitt] [ GarvTambiGitter[] [ KhizirSiddiquiGi] [ NishaGeorgeGitte] [ robotcatorGitter] [ TanviAgarwalGitt]
12:46 [ AbishaiEbenezerG] [ GauravSinghGitte] [ KimSangYeon-DGU[] [ nishantkr18[m] ] [ RohitKartikGitte] [ tejasvi[m] ]
12:46 [ AmeetKumarRanaGi] [ geek-2002Gitter[] [ kritika12298Gitt] [ NishantKumarGitt] [ RoHitRushilGitte] [ TrinhNgo[m] ]
12:46 [ AniThoGitter[m] ] [ gotadachi ] [ kuhaku ] [ ocelaiwo[m] ] [ RudraPatil[m] ] [ UmarGitter[m] ]
12:46 [ AnjishnuGitter[m] [ gtank___ ] [ kunal12298Gitter] [ OmarWagih1Gitter] [ ryan[m] ] [ Valliappan_CAGit]
12:46 [ AryamanBhagatGit] [ harshitaarya[m] ] [ kyrre ] [ outmanipulateGit] [ saksham189Gitter] [ vansika__ ]
12:46 [ ayush29[m] ] [ hemal[m] ] [ LakshyaOjhaGitte] [ Param-29Gitter[m] [ Saksham[m] ] [ vigsterkr[m] ]
12:46 [ azwn[m] ] [ himanshu_pathak[] [ lozhnikov ] [ petris ] [ SakshamRastogiGi] [ VSaicharanGitter]
12:46 [ benpa[m] ] [ HimanshuPathakGi] [ M_slack_18 ] [ pickle-rick[m] ] [ SaraanshTandonGi] [ zalava[m] ]
12:46 [ bhanukumarGitter] [ ImQ009 ] [ M_slack_mlpack13] [ PranavReddyP16Gi] [ Shikhar-SGitter[] [ zoq ]
12:46 [ birm[m]1 ] [ incrypt0 ] [ Manav-KumarGitte] [ PrinceGuptaGitte] [ shikharj[m] ] [ zoq[m] ]
12:46 [ bisakh[m]1 ] [ jacob-earleGitte] [ metahost ] [ PulkitgeraGitter] [ ShikharJaiswalGi]
12:46 [ bkb181[m] ] [ jeffin143[m] ] [ mlozhnikov[m]1 ] [ rahulverma7788Gi] [ shrit[m] ]
12:46 [ Cadair ] [ jenkins-mlpack2 ] [ mohona[m] ] [ rcurtin ] [ siddhant2001Gitt]
12:46 [ chopper_inbound4] [ JoelJosephGitter] [ MostafaNabiehGit] [ rcurtin[m] ] [ sreenik[m] ]
12:46 [ chopper_inbound[] [ johnsoncarl[m] ] [ MrityunjayTripat] [ RishabhGoel[m] ] [ SriramSKGitter[m]
12:46 -!- Irssi: #mlpack: Total of 103 nicks [0 ops, 0 halfops, 0 voices, 103 normal]
12:46 -!- Channel #mlpack created Tue Oct 11 18:35:40 2011
12:46 -!- Home page for #mlpack: http://www.mlpack.org
12:46 -!- Irssi: Join to #mlpack was synced in 24 secs
13:15 < AnjishnuGitter[m> I was trying to build my local fork of mlpack, but I ran into an error when building, and then bash exits with code 2. Coming to the error itself, it is occurring in some files like output_width_visitor_impl and other such files which I haven’t modified. The same error message can also be seen
13:15 < AnjishnuGitter[m> [here](https://dev.azure.com/mlpack/mlpack/_build/results?buildId=1098&view=logs&j=24d3abe3-ef0b-5deb-3aab-64d839de2c3c&t=8be94158-0791-5881-12e7-451b62b18296) in a build on a PR I had previously submitted. How do I proceed with this? Because I presume I would need to build locally to test some layers I am working on, but I can’t now. @zoq
13:17 < zoq> AnjishnuGitter[m: Did you merge the current master branch?
13:17 < AnjishnuGitter[m> Yep, everything is up to date as far as I can see.
13:18 < zoq> AnjishnuGitter[m: Do you have a link to the PR?
13:18 < AnjishnuGitter[m> https://github.com/mlpack/mlpack/pull/2307
13:20 < bisakh[m]1> zoq: across all PRs, all macOS and Windows builds are failing due to an ensmallen download error
13:21 < zoq> AnjishnuGitter[m: Let me comment on the PR.
13:21 < AnjishnuGitter[m> Okay!
13:25 < zoq> bisakh[m]1: Do you have an example PR?
13:25 < zoq> bisakh[m]1: Not sure I see the issue.
13:26 < AnjishnuGitter[m> @zoq I think the error pointed to by @bisakh can be seen from the PR I mentioned above as well.
13:26 < bisakh[m]1> zoq: Yes, a few minutes back I pushed updates to PR #2274
13:27 < bisakh[m]1> <https://github.com/mlpack/mlpack/pull/2274>
13:28 < zoq> Ahh I see, looks like we have some connection issues.
13:29 < bisakh[m]1> yeah it seems so.
13:30 < bisakh[m]1> Hey zoq, I was thinking: since WGAN and WGAN-GP are already implemented and PacGAN is going to be implemented, what about implementing image-to-image translation (pix2pix) in mlpack, along with DeepLab-v3, with tests and documentation, this summer under the "Application of ANN" idea?
13:30 < zoq> I think this is a temporary issue, so let's wait some hours.
13:31 < zoq> bisakh[m]1: Sounds like a great idea to me.
13:32 < bisakh[m]1> So can I make a proposal on this?
13:32 < zoq> bisakh[m]1: Sure, feel free.
13:32 < bisakh[m]1> If time permits, we can also cover 1-2 small topics in this.
13:33 < zoq> bisakh[m]1: Hm, I think this is already a big enough project, but yeah, if there is time left it's always nice to have something that could be added on top.
13:34 < bisakh[m]1> zoq: okay! It would be great if you could point out what should be covered in the proposal
13:35 < zoq> bisakh[m]1: https://github.com/mlpack/mlpack/wiki/Google-Summer-of-Code-Application-Guide should be helpful.
13:35 < bisakh[m]1> like idea explanation...
13:35 < Saksham[m]> Hey zoq, SimpleAdaDeltaTestFunctionFMat is failing on my system. I’ve opened up an issue.
13:36 < Saksham[m]> Can you have a look?
13:38 < bisakh[m]1> zoq thanks
13:40 < zoq> Saksham[m]: Just commented on the issue.
14:08 -!- john91 [9d27aae4@157.39.170.228] has joined #mlpack
14:08 -!- john91 [9d27aae4@157.39.170.228] has quit [Remote host closed the connection]
15:02 < favre49> zoq: rcurtin: Is setting up azure pipelines something only you guys can do? If not, I wanted to try moving our style checks and stuff over to Azure so we can ditch Travis entirely
15:03 < zoq> favre49: For the models repo?
15:03 -!- lozhnikov [~mikhail@lozhnikov.static.corbina.ru] has quit [Quit: ZNC 1.7.5 - https://znc.in]
15:04 < favre49> zoq: Ensmallen and mlpack as well
15:05 < favre49> Oh right, contributors currently don't have access to approvals and stuff on models and examples; not sure if that's on purpose
15:05 < zoq> favre49: We don't use travis for style checks on the mlpack repo.
15:05 < zoq> favre49: That's a jenkins job.
15:05 < favre49> Oops yeah, you're right
15:06 < favre49> Wouldn't it be simpler for everything to be on azure pipelines though? I assume we have to pay for AWS
15:06 < zoq> favre49: I don't think azure provides a nice interface to show style issues.
15:06 < zoq> favre49: No, it's free for open source projects.
15:06 < favre49> Oh okay then I suppose it doesn't matter
15:11 < SriramSKGitter[m> What functionality does AppVeyor offer such that we're still using it?
15:12 < zoq> SriramSKGitter[m: It builds the MSI installer package; that could be done in Azure as well, but I haven't had time to debug this one.
15:14 < SriramSKGitter[m> Ah I see. Only asked because far too many of my builds have failed after exceeding build time on Appveyor :)
15:14 < zoq> SriramSKGitter[m: yeah, you can ignore those issues.
16:16 < PrinceGuptaGitte> Hi, I'm working on InceptionV1 and it uses padded pooling layers, which are being implemented in #2318. So should I wait for that to be merged or implement it locally (but then when I open a PR, it'll cause problems)? Any ideas what I should do?
16:18 < PrinceGuptaGitte> (edited) ... in #2318. So ... => ... in #2127. So ...
16:30 < zoq> PrinceGuptaGitte: You can use git cherry-pick.
16:37 < PrinceGuptaGitte> Thanks :)
16:59 < AnjishnuGitter[m> So, since I was kind of new to git a few weeks back, I made probably the most basic mistake possible by editing directly on the master branch of my fork when creating #2307. Coming to the present, I am working on a different feature locally, but it just struck me that if I create a branch from my master, then the changes from #2307 will be reflected in this branch, which shouldn’t be the case. As far as I can
16:59 < AnjishnuGitter[m> tell, I have 2 options. Either delete my fork and re-open a different PR with the same changes as #2307, plus a separate PR for my new feature, or keep #2307 as it is until it is merged and, for the new feature, create a branch from my master and remove the commits corresponding to #2307 from that branch. However, this second option doesn’t really feel ideal. What should I be doing in this case? 🙈 @zoq
17:00 < zoq> AnjishnuGitter[m: I guess I would go with option one.
17:01 -!- togo [~togo@2a02:6d40:34db:fd01:b5ea:2c52:90be:51ed] has joined #mlpack
17:01 < AnjishnuGitter[m> Alright! Let’s nuke everything :)
17:39 < PranavReddyP16Gi> @iamshnoo something similar happened to me; have you tried git reset --hard? Here's a good link I found in case you need further clarification: https://opensource.com/article/18/6/git-reset-revert-rebase-commands
17:39 < PranavReddyP16Gi> I hope you read this before nuking everything 😅
17:41 < AnjishnuGitter[m> Thanks for that. I will have a look through the link. Should help to avoid future mistakes.
17:44 -!- favre49 [~favre49@49.207.58.225] has quit [Quit: Lost terminal]
18:18 < LakshyaOjhaGitte> Hi @iamshnoo, I think you can use git revert; it will show you all the commits you've ever made, and you can go back to the commit you want from wherever you are now.
18:19 < LakshyaOjhaGitte> It will help if you remember which commit is the one you need right now. It's like being able to go back in time on the branch you want this for.
18:28 -!- AbdullahKhilji[m [slackml_37@gateway/shell/matrix.org/x-rnpkerpjmwksshhg] has joined #mlpack
18:28 < AbdullahKhilji[m> Hi, I am Abdullah Khilji. Looking forward to joining mlpack for GSoC this summer.
18:30 < AbdullahKhilji[m> I'm interested in the reinforcement learning project; am I too late to join here?
18:33 < zoq> AbdullahKhilji[m: Hello, no you are not too late.
18:36 -!- toluschr [~toluschr@xdsl-85-197-63-110.nc.de] has joined #mlpack
18:38 < Param-29Gitter[m> Hey @zoq, do you think I should close #2286 (since I am not getting any speedup when compiled with OpenMP)? Also, I would love to have your views on #2315.
18:41 < AbdullahKhilji[m> For an introduction: I am a 3rd-year CSE student at NIT Silchar and an undergrad researcher at the AI Lab at my institute. I have written 4-5 research papers (under review) and am passionate about research, especially the reinforcement learning domain. Many of my research projects are also based on the natural language domain, but I have a strong willingness to explore new avenues in RL. I have read the mlpack wiki; could anyone let
18:41 < AbdullahKhilji[m> me know what my initial steps should be in order to proceed further?
18:41 -!- toluschr [~toluschr@xdsl-85-197-63-110.nc.de] has quit [Quit: toluschr]
18:41 < zoq> AbdullahKhilji[m: Sounds good, https://www.mlpack.org/gsoc.html should be helpful.
18:42 < Param-29Gitter[m> Hey @zoq, should I close #2286 (since I am not getting any speed-up when compiled with OpenBLAS)? Also, I would love to hear your views on #2315.
18:42 < zoq> Param-29Gitter[m: Hm, I guess you are right, I would have to take a closer look, run some tests as well, but don't think I can do that in the next days.
18:44 < Param-29Gitter[m> I tried running many different tests for #2286, but it works best when compiled with OpenBLAS without OpenMP. :)
18:45 < zoq> Param-29Gitter[m: In this case, let's close the PR.
18:48 < Param-29Gitter[m> I'll try doing profiling of some more algorithms. I guess that would help me decide which algorithms to consider for parallelizing.
21:03 -!- ImQ009 [~ImQ009@unaffiliated/imq009] has quit [Quit: Leaving]
22:50 -!- M_slack_mlpack16 [slackml_38@gateway/shell/matrix.org/x-ksugzzrcfpegzjgn] has joined #mlpack
22:52 -!- M_slack_mlpack16 is now known as JatoJoseph[m]
22:52 < JatoJoseph[m]> Hello, I wish to work on the project "Application of ANN Algorithms Implemented in mlpack"
22:52 < JatoJoseph[m]> <JatoJoseph[m] "Hello, I wish to work on the pro"> Where do I start?
22:53 < zoq> JatoJoseph[m]: Hello, mlpack.org/gsoc.html should be helpful.
23:08 < rcurtin> zoq: I think I figured out the convolutional network issue... I spent quite a while checking the math and everything, then realized basically all we have to do is use the `input` parameter provided by the Gradient() method instead of making an alias to the one we got in `Forward()` :)
23:10 < rcurtin> (I'm referring to the issue that's solved by the PR where I made a copy, but didn't merge)
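
As a hedged illustration of the shape of the fix described above (a toy linear-style layer, not the actual mlpack convolution code): Gradient() works from the input argument it receives instead of an alias saved during Forward().

    #include <armadillo>

    class ToyLayer
    {
     public:
      void Forward(const arma::mat& input, arma::mat& output)
      {
        // A layer might be tempted to keep an alias to `input` here for later use.
        output = weights * input;
      }

      void Gradient(const arma::mat& input,  // use this argument directly...
                    const arma::mat& error,
                    arma::mat& gradient)
      {
        gradient = error * input.t();        // ...rather than a stored alias.
      }

      arma::mat weights = arma::randu<arma::mat>(2, 3);
    };

    int main()
    {
      ToyLayer layer;
      arma::mat input = arma::randu<arma::mat>(3, 5), output, gradient;
      layer.Forward(input, output);
      layer.Gradient(input, arma::randu<arma::mat>(2, 5), gradient);
      return 0;
    }
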
23:10 < zoq> rcurtin: time-intensive debug session -> simple fix; I guess we can be happy about the outcome.
23:10 < rcurtin> yeah, agreed!
23:11 < rcurtin> but I have been playing with -fsanitize=address and working my way through mlpack_test
23:11 < rcurtin> I debugged all the ANN layers, so all those tests pass now
23:11 < rcurtin> now at KDETest... so I am making progress through the alphabet :)
23:11 < zoq> wow, nice
23:11 < zoq> :)
23:11 < rcurtin> I really hope to finish today or tomorrow, it's basically most of what I'm doing today
23:11 < rcurtin> slow debugging cycle though---compiling with -fsanitize=address seems to take significantly longer
23:12 < zoq> even with multiple cores?
23:12 < rcurtin> yeah, it uses way more RAM too so I have to be conservative and only compile with 2 or 3 cores
23:12 < rcurtin> (otherwise the system swaps and my music stops :-D)
23:12 < zoq> :D
23:13 < rcurtin> there is one problem I'm not solving that I know about though---the copy constructors for all of the layers are the default copy constructors
23:13 < rcurtin> but this will copy the members like `inputParameter`, `outputParameter`, etc. which aren't meant to be copied
23:13 < rcurtin> this is what results in the problem reported in #2314
23:13 < zoq> right, I could handle that one, or we open an issue
23:14 < rcurtin> so my PR won't address that. it might be easy to put together a quick workaround, but to really solve it right, all the layers would need a copy constructor
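
A self-contained sketch of the idea being discussed (a hypothetical toy class, not mlpack's actual layer code): an explicit copy constructor copies the trained weights but leaves per-pass caches such as inputParameter and outputParameter empty instead of copying them.

    #include <armadillo>
    #include <cstddef>

    class ToyLayer
    {
     public:
      ToyLayer(const std::size_t inSize, const std::size_t outSize) :
          weights(arma::randu<arma::mat>(outSize, inSize)) { }

      // Explicit copy constructor: copy the parameters only; the cached
      // inputParameter/outputParameter start empty for the new object.
      ToyLayer(const ToyLayer& other) : weights(other.weights) { }

      arma::mat weights;
      arma::mat inputParameter;   // per-forward-pass cache; intentionally not copied
      arma::mat outputParameter;  // per-forward-pass cache; intentionally not copied
    };

    int main()
    {
      ToyLayer a(3, 2);
      a.inputParameter = arma::randu<arma::mat>(3, 1);  // stale per-pass state
      ToyLayer b(a);  // b copies a's weights, but b's caches start empty
      return 0;
    }
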
23:14 < rcurtin> I think opening an issue would be fine---this is the type of thing where it's actually a pretty good task for people looking to get involved :)
23:14 < rcurtin> I only wish we had noticed and opened the issue in February :)
23:15 < zoq> Pretty sure someone will pick this up in a couple of hours.
23:15 < zoq> But yeah, it's a nice entrance task.
23:20 < rcurtin> the only thing is, @ilya-pchelintsev already said on #2314 that he'd like to fix it, but hasn't responded yet
23:21 < rcurtin> in any case, maybe he'll respond in the next day or two and we'll see what the best way forward is
23:21 < zoq> yeah, maybe he is busy with some other stuff
23:31 < rcurtin> made it all the way to RecurrentNetworkTest, most of the way through the alphabet :)
23:31 < rcurtin> I'm scared of SerializationTest and TreeTest though...
23:31 < rcurtin> oh wait, I think it's not 100% alphabetical... it hasn't done any of the main tests yet
23:32 < rcurtin> I used to think it was okay if those leaked a little memory because it would be reclaimed when the program exited, but that only applies for command-line bindings; a memory leak would actually be a problem in a Python or Julia session
23:32 < rcurtin> so I guess they all have to be debugged :)
23:33 < zoq> a little memory leak :)
23:33 < zoq> I have to pay more attention to the memory check job.
--- Log closed Sun Mar 22 00:00:39 2020