mlpack IRC logs, 2020-03-16

Logs for the day 2020-03-16 (starts at 0:00 UTC) are shown below.

March 2020
--- Log opened Mon Mar 16 00:00:30 2020
00:14 -!- togo [~togo@2a02:6d40:34db:fd01:6892:ca85:d8de:cf7b] has quit [Ping timeout: 272 seconds]
03:57 -!- gio4 [beeccef6@] has joined #mlpack
04:00 -!- gio4 [beeccef6@] has quit [Remote host closed the connection]
07:06 < jenkins-mlpack2> Project docker mlpack nightly build build #642: STILL FAILING in 2 hr 52 min:
08:43 -!- wayase47 [9d21ab9a@] has joined #mlpack
08:59 -!- ImQ009 [~ImQ009@unaffiliated/imq009] has joined #mlpack
09:37 -!- wayase47 [9d21ab9a@] has quit [Remote host closed the connection]
10:39 -!- favre49 [6a331a9d@] has joined #mlpack
10:40 -!- favre49 [6a331a9d@] has quit [Remote host closed the connection]
11:38 -!- witness [uid10044@gateway/web/] has joined #mlpack
12:24 < PrinceGuptaGitte> This documentation still uses r value references.
12:28 < rcurtin> PrinceGuptaGitte: take a look at the URL, it's for mlpack 3.1.0, which does use rvalue references
12:28 < PrinceGuptaGitte> Oh right, it hasn't been released yet.
12:28 < rcurtin> the mlpack-git version has the changes:
12:28 < rcurtin> :)
12:28 < PrinceGuptaGitte> wow, took you no time to notice.
12:30 < jenkins-mlpack2> Project docker mlpack weekly build build #96: ABORTED in 22 days:
12:30 < jenkins-mlpack2> * Ryan Curtin: Rename files and mark +x.
12:30 < jenkins-mlpack2> * Ryan Curtin: Add first attempt at ensmallen memory check script.
12:43 < rcurtin> favre49: I was trying to figure out the documentation issue you were talking about a few days ago:
12:43 < rcurtin> "It seems that the function definition is not showing at"
12:43 < rcurtin> but I can't seem to figure out what you were referring to; to me I can see all the definitions of f(x), f'(x), and f^-1(x)
12:44 < rcurtin> maybe the page has changed over the weekend or something when it got rebuilt? anyway, let me know if there is still an issue
13:11 < vigsterkr[m]> zoq: rcurtin around?
13:18 < jenkins-mlpack2> Project docker mlpack monthly build build #16: ABORTED in 22 days:
13:18 < jenkins-mlpack2> * Marcus Edel: Convert HRF to JUNIT format.
13:18 < jenkins-mlpack2> * Marcus Edel: Raise exception if the format is unknown.
13:18 < jenkins-mlpack2> * birm: jenkins server name changed
13:18 < jenkins-mlpack2> * noreply: Update
13:18 < jenkins-mlpack2> * birm: documentation specific for this registry
13:18 < jenkins-mlpack2> * Marcus Edel: Use the actual error message instead of a default one.
13:18 < jenkins-mlpack2> * Ryan Curtin: Rename files and mark +x.
13:18 < jenkins-mlpack2> * Ryan Curtin: Add first attempt at ensmallen memory check script.
13:19 < rcurtin> vigsterkr[m]: yep, I'm here, just digging out from the weekend :)
13:20 < rcurtin> robertohueso: just scrolled back and saw your message, the situation seems tough in Spain, stay safe! (and sane, if you're stuck inside all day :-O)
13:23 < vigsterkr[m]> rcurtin: what's your status of azure pipelines?
13:24 < vigsterkr[m]> i think we are dead since last friday :(
13:25 -!- zoq[m] [slackmlp4@gateway/shell/] has joined #mlpack
13:25 < zoq[m]> <|>
13:25 < vigsterkr[m]> zoq: so basically yaml based things are dead since friday :D
13:25 < vigsterkr[m]> amazing
13:26 < zoq[m]> There is a quickfix.
13:26 < vigsterkr[m]> ah i see
13:26 < vigsterkr[m]> seriously :D
13:26 < vigsterkr[m]> :>>>
13:27 < rcurtin> yeah, I'm surprised, it feels like everything is breaking on Github lately
13:27 < rcurtin> what's the quickfix?
13:28 < vigsterkr[m]>
13:28 < rcurtin> set the region to India or Australia where issues aren't being reported? :)
13:28 < vigsterkr[m]> basically explicitly whitelist everything
13:28 < vigsterkr[m]> :))))
13:28 < rcurtin> ohh, it says right there. I did not read closely enough
13:28 < rcurtin> thank you :)
13:28 < vigsterkr[m]> which literally contradicts their docs
13:28 < vigsterkr[m]> of course
13:28 < vigsterkr[m]> ;)
13:28 < vigsterkr[m]> as that should be the default behaviour :P
13:28 < vigsterkr[m]> zoq: thnx for the link!
13:31 < Saksham[m]> Hi Ryan Curtin, I was interested in the CMA-ES project. I have gone through the original paper, the tutorial, and the codebase. Can you provide some more details I could include in my proposal? Also, does the project have to be about CMA-based strategies? I read about natural evolution strategies (PEPG) and it seemed interesting to me.
13:31 < rcurtin> zoq[m]: do you think we should open a PR with the workaround, and then ask everyone who is having trouble to merge master into their branch once the workaround is committed?
13:32 < rcurtin> Saksham[m]: I don't know much about the CMA-ES project, unfortunately, so I can't give much input about it that would be more helpful than anything you might have already read :(
13:34 < Saksham[m]> Thanks a lot! I see zoq is listed as a potential mentor, he might be able to give me a better idea
13:35 < Saksham[m]> Hi zoq, can you go through my previous message about CMA-ES based optimizers and help me a little?
13:58 -!- favre49 [6a331a9d@] has joined #mlpack
14:00 -!- favre49 [6a331a9d@] has quit [Remote host closed the connection]
14:03 -!- favre49 [6a331a9d@] has joined #mlpack
14:05 -!- favre49 [6a331a9d@] has quit [Remote host closed the connection]
14:11 -!- favre49 [6a331a9d@] has joined #mlpack
14:11 < favre49> rcurtin You're right, I figured out what was up but it's weird
14:12 < favre49> when I open it on a half-width window, it doesn't show. But it does on a full width window
14:12 < favre49> And if I switch from full width to half width
14:21 < rcurtin> oh, interesting, let's see if I can reproduce it
14:22 < rcurtin> huh, can't seem to reproduce... stretched my window width from ~400px (small enough for the mobile version) to 6400px (I have a big resolution :))
14:23 < rcurtin> the formulas seemed to display just fine the full time
14:35 -!- toluschr [] has joined #mlpack
14:38 < toluschr> I'm having issues compiling mlpack in a chroot environment. Here is a log of the build
14:38 < toluschr> As you can see, the file is present.
14:48 < PrinceGuptaGitte> in this page's Layer API section, to make our own layer we need to make a getter for the gradient matrix: `OutputDataType& Gradient() { return gradient; }`. But in the FFN class I didn't notice its use. Is it integral for FFN to work?
14:49 < PrinceGuptaGitte> Same for: `OutputDataType& Parameters() { return weights; }`
14:49 < PrinceGuptaGitte> `InputDataType& InputParameter() { return inputParameter; }`,
14:49 < PrinceGuptaGitte> I didn't see their usage anywhere
14:49 < chopper_inbound[> <toluschr "As you can see, the file libarma"> You can try removing armadillo and rebuilding the latest version; it worked for me with a similar error
14:50 < LakshyaOjhaGitte> whats going on these days, azure, github.....even corona
14:50 < LakshyaOjhaGitte> (edited) ... azure, github.....even corona => ... azure, github.....corona
14:51 < LakshyaOjhaGitte> all being affected
14:52 < LakshyaOjhaGitte> @prince776 I am not sure but I think weights are used for differentiated use between layers.
14:55 < rcurtin> toluschr: can you tell me more about the chroot environment you're using?
14:56 < rcurtin> assuming the chroot is correctly in masterdir/, the make error doesn't seem to make sense
14:57 < rcurtin> maybe the symbolic link fails when inside the chroot or something?
14:58 < favre49> rcurtin Hmm I guess we'll never know. I assume you use firefox too
15:00 < kartikdutt18Gitt> Hey @prince, the Parameters() getter is used to set weights. You can refer to the tests of LSTM; I think I saw the various params used in the tests for Transposed Convolutions.
15:02 < toluschr> rcurtin It's the void-linux build system. I installed `libarmadillo-devel` and `boost-devel` as dependencies. I don't think that the symlink is a problem, as it is the official package.
15:02 < toluschr> I can send you a log of all commands executed if you want me to
15:03 < rcurtin> favre49: yeah, I also tested in chromium
15:04 < rcurtin> toluschr: hm, interesting, but I guess I'm not sure what happens when something is symlinked outside the chroot; what happens in the chroot if you run `ls -lh /usr/lib/`?
15:04 -!- favre49 [6a331a9d@] has quit [Remote host closed the connection]
15:04 < jeffin143[m]> Rcurtin : you there ??
15:04 < rcurtin> jeffin143[m]: yeah, I'm still here, I have not gotten up in the past 30 seconds :-D
15:06 < toluschr> rcurtin
15:06 < jeffin143[m]> :-p , you left a comment to implement column major in string encoding
15:07 < rcurtin> jeffin143[m]: I haven't gotten back to that PR yet, I'll probably handle it later today if I can
15:07 < rcurtin> toluschr: ok, I guess, I was just worried that the symlink was to a destination *outside* the chroot
15:07 < jeffin143[m]> But I was doubtful how will we handle without padding case
15:07 < rcurtin> but it looks like that is not the case
15:09 < rcurtin> toluschr: I'm a bit at a loss for this one, my thoughts for directions to investigate are:
15:09 < rcurtin> - is this a cross-compilation situation or anything? maybe is compiled for the wrong architecture and the compiler can't use it?
15:10 < rcurtin> - is the chroot of make actually the exact same as what you're thinking? like, is it possible that does actually not exist in whatever environment 'make' is running in?
15:10 < toluschr> I'm not cross compiling, but I can try adding libarmadillo as a host dependency
15:11 < rcurtin> yeah, maybe worth a shot? honestly I don't have too many clear ideas here
15:12 < toluschr> Still doesn't work. I'm not sure what's going on here. Probably a stupid mistake I made.
15:12 < toluschr> I'm thinking of just removing the check for
15:13 < rcurtin> toluschr: removing from the Makefile directly? I guess that's one strategy that could work :)
15:15 < toluschr> Well now that is really strange. g++ can't find it while linking either.
15:35 < toluschr> rcurtin It has nothing to do with the chroot environment. It even fails on my host system.
15:37 < toluschr> The voidlinux package is broken.
15:37 < toluschr> The symlink points to nothing, because armadillo is not a dependency of armadillo-devel
15:41 < rcurtin> toluschr: well, that could be the reason :-D
15:42 < toluschr> Works now
15:43 < toluschr> I spent an hour on nothing...
15:50 < rcurtin> hehe
15:50 < rcurtin> I know the feeling :)
15:51 < jeffin143[m]> Hehe that was bad
16:11 < zoq> rcurtin: Will open a PR with a quickfix for the ci later.
16:20 < zoq> just needs two approvals :)
16:27 < PrinceGuptaGitte> Thanks @kartikdutt18 .
16:27 < PrinceGuptaGitte> Just a side question, in your LeNet model, why did you use Sequential<> instead of FFN<>
16:27 < rcurtin> zoq: one of two :)
16:31 -!- favre49 [6a331a9d@] has joined #mlpack
16:31 < favre49> zoq I went ahead and merged it. That's a pretty big problem fixed :)
16:32 < rcurtin> just merged it into #2305, seems to work just fine
16:32 -!- toluschr [] has quit [Quit: toluschr]
16:32 < favre49> Ah, so people will need to merge into the new master for it to work
16:39 < rcurtin> yeah, but I don't feel the need to do a big blast of comments on all the PRs :) I was planning on only mentioning it on ones that I was reviewing
16:42 < favre49> Wouldn't it be best if you could have mlpack-bot do announcements across PRs?
16:42 < favre49> This kind of situation isn't normal (don't remember something like this happening last year) but in the case it does, saves you a headache :)
16:47 < kartikdutt18Gitt> @prince776, Sequential allows me to think of it as a layer, so we can add something before the model too, in case the model is a subpart of some other model. I don't think we can do model1.Add(model2) if both are FFN, so using Sequential gives the user some flexibility.
16:50 -!- favre49 [6a331a9d@] has quit [Remote host closed the connection]
16:52 -!- tip [5c2a1c3a@] has joined #mlpack
16:52 -!- tip is now known as Guest7503
16:54 -!- Guest7503 [5c2a1c3a@] has quit [Remote host closed the connection]
17:13 -!- zoso_floyd [~blurryfac@2409:4042:198:432f:9865:581c:7050:e4d1] has joined #mlpack
17:13 -!- zoso_floyd [~blurryfac@2409:4042:198:432f:9865:581c:7050:e4d1] has quit [Client Quit]
17:28 -!- karyam [] has joined #mlpack
17:36 -!- erdem [5c2a1c3a@] has joined #mlpack
17:38 < erdem> Hello, I wonder how to submit proposal to mlpack?
17:40 < himanshu_pathak[> kartikdutt18 (Gitter): Currently in my PR for the Neural Turing Machine I am trying to change FFN so that it can be used as. Right now I am stuck due to some bug; once I find a workaround it will be a nice thing for us to use
17:40 < himanshu_pathak[> *So, that it can be used as a layer
17:43 -!- erdem [5c2a1c3a@] has quit [Remote host closed the connection]
17:46 < kartikdutt18Gitt> Yes, I think that would be great. :)
17:46 < PrinceGuptaGitte> I see. It's a nice idea.
17:50 < himanshu_pathak[> Yes, but my only problem with it is that every time I find a workaround, a new error comes my way.
17:53 < PrinceGuptaGitte> If you get stuck at something for too long, you can always ask here, we are happy to help :)
17:56 -!- karyam [] has quit [Remote host closed the connection]
17:58 < himanshu_pathak[> Yes, I am stuck with a problem; it is as follows
18:01 < himanshu_pathak[> In memory_head_impl.hpp I am trying to do gy += prevDW, where gy is of type ErrorType. Before the rvalue refactor it was working, but now I am getting an error note: ‘arma::vec {aka arma::Col<double>}’ is not derived from ‘const arma::Op<T1, op_type>’
18:02 < himanshu_pathak[> prevDW is of type arma::vec; I don't know why it's not working after the rvalue refactor
18:04 < PrinceGuptaGitte> I don't know how exactly rvalue references work, I'll have to do some reading I guess.
18:06 < himanshu_pathak[> Maybe Ryan has a better answer for this.
18:07 < PrinceGuptaGitte> yes
18:30 < rcurtin> himanshu_pathak[: I'm in a meeting right now so my answer will be kind of short but I hope it helps...
18:30 < rcurtin> my guess is that gy was previously inferred as arma::vec, but now there is some type that's being inferred incorrectly
18:31 < rcurtin> so, e.g., if you are doing Forward(a, b + c), where that "b + c" is an Armadillo expression, the type of the second parameter gets inferred as some arma::Op<...> intermediate type
18:31 < rcurtin> the way to fix it might be to form a temporary vector or something for each of the parameters to Forward()
18:32 < rcurtin> or something of this general idea---I hope the input helps
18:37 < himanshu_pathak[> Thanks for the help, I will try this out. rcurtin: I will try to add temporary vectors, maybe this will work.
19:12 < jenkins-mlpack2> Project docker mlpack weekly build build #97: STILL FAILING in 6 hr 42 min:
19:12 < jenkins-mlpack2> Ryan Curtin: Copy data before test run.
19:26 < rcurtin> himanshu_pathak[: yeah, in any case, the issue is likely rooted in incorrect types being inferred for template parameters
19:31 < himanshu_pathak[> Yes, in my code I am trying to do an operation with incorrect types. Thanks again for the help 🙂
19:59 -!- M_slack_mlpack13 [slackml_32@gateway/shell/] has joined #mlpack
20:25 -!- M_slack_mlpack13 is now known as TrinhNgo[m]
20:25 * TrinhNgo[m] sent a long message: < >
20:27 < TrinhNgo[m]> I realized that each project idea has some mentors attached, but I cannot find them on the Slack channel :((
21:00 -!- ImQ009 [~ImQ009@unaffiliated/imq009] has quit [Quit: Leaving]
22:29 < jenkins-mlpack2> Project docker mlpack nightly build build #643: NOW UNSTABLE in 9 hr 10 min:
22:30 < zoq> TrinhNgo[m]: If you have any question you can ask here, it's generally not a great idea to contact mentors directly (private message).
22:33 < zoq> TrinhNgo[m]: To answer your question, we don't require contributions to be considered for GSoC. But any contribution is great, because it shows you can work with a larger project.
22:34 < zoq> TrinhNgo[m]: (2) Yes, we have an application guide that should be helpful:
22:36 < zoq> TrinhNgo[m]: (3) - That could be a project, yes; some people have proposed implementing an object detection model or an object segmentation model.
23:08 -!- witness [uid10044@gateway/web/] has quit [Quit: Connection closed for inactivity]
--- Log closed Tue Mar 17 00:00:32 2020