mlpack IRC logs, 2020-02-15
Logs for the day 2020-02-15 (starts at 0:00 UTC) are shown below.
--- Log opened Sat Feb 15 00:00:02 2020
00:00 -!- petris_ is now known as petris
01:57 < tejasvi[m]> I'm hoping someone can review https://github.com/mlpack/mlpack/pull/2127; I've finished adding the tests
03:43 < kartikdutt18Gitt> Hi @tejasvi, I have left some comments; have a look when you get the chance.
04:16 -!- volhard[m] [volhardmat@gateway/shell/matrix.org/x-hfwcmlweifvxsmcj] has joined #mlpack
04:17 < volhard[m]> Should I feed a recurrent net (GRU) audio in the time or frequency domain?
04:51 < kartikdutt18Gitt> Hi @volhard, you can work in the time domain too, but generally the data is converted into the frequency domain using techniques such as the STFT.
04:56 < kartikdutt18Gitt> I think patterns become more apparent in the frequency domain. If you have heard of the DTFT, it is a discrete-time version of the Fourier transform which is periodic.
05:01 < kartikdutt18Gitt> You can read more about the STFT here: https://www.dsprelated.com/freebooks/sasp/Short_Time_Fourier_Transform.html
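[A minimal sketch of the STFT idea discussed above, in plain stdlib C++ with a naive O(n^2) DFT per windowed frame; the function names are illustrative, not mlpack's or any DSP library's API:]

```cpp
#include <cassert>
#include <cmath>
#include <complex>
#include <cstddef>
#include <vector>

const double kPi = std::acos(-1.0);

// Naive DFT of one frame (O(n^2); real code would call an FFT library).
std::vector<std::complex<double>> Dft(const std::vector<double>& frame)
{
  const std::size_t n = frame.size();
  std::vector<std::complex<double>> out(n);
  for (std::size_t k = 0; k < n; ++k)
    for (std::size_t t = 0; t < n; ++t)
      out[k] += frame[t] * std::polar(1.0, -2.0 * kPi * double(k * t) / double(n));
  return out;
}

// STFT: slide a Hann-windowed frame across the signal, DFT each frame.
// Each frame's magnitude spectrum could then be one time step for a GRU.
std::vector<std::vector<std::complex<double>>> Stft(
    const std::vector<double>& signal,
    const std::size_t frameSize,
    const std::size_t hop)
{
  std::vector<std::vector<std::complex<double>>> frames;
  for (std::size_t start = 0; start + frameSize <= signal.size(); start += hop)
  {
    std::vector<double> frame(frameSize);
    for (std::size_t t = 0; t < frameSize; ++t)
    {
      const double hann =
          0.5 - 0.5 * std::cos(2.0 * kPi * double(t) / double(frameSize - 1));
      frame[t] = signal[start + t] * hann;
    }
    frames.push_back(Dft(frame));
  }
  return frames;
}
```

[With frameSize 256 and hop 128, a 1024-sample signal yields 7 overlapping frames, each a 256-bin spectrum.]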
05:26 -!- Yihan [firstname.lastname@example.org] has joined #mlpack
06:00 < metahost> volhard: you may! Tasks like wake word detection (like Hey Siri), use CNNs (+ sliding window) to predict when a phrase is detected. If you need contextual information, you should probably use RNNs.
06:01 < metahost> Here's a tutorial you may find useful: https://www.dlology.com/blog/how-to-do-real-time-trigger-word-detection-with-keras/
06:02 < metahost> But yes, taking an FT makes the individual frequency components stand out
06:03 < metahost> Here's another link: https://github.com/MycroftAI/mycroft-precise (Check the how it works section)
07:02 -!- Yihan [email@example.com] has quit [Ping timeout: 260 seconds]
07:52 < PrinceGuptaGitte> Hi @kartikdutt18 Thanks for the review. I've made the fixes you suggested on PR #2208, and I have also cited the FaceNet paper.
08:13 -!- zwasd [~zwasd@unaffiliated/zwasd] has joined #mlpack
08:18 < jenkins-mlpack2> Project docker mlpack nightly build build #614: UNSTABLE in 3 hr 4 min: http://ci.mlpack.org/job/docker%20mlpack%20nightly%20build/614/
08:28 -!- zwasd [~zwasd@unaffiliated/zwasd] has quit [Quit: Leaving]
08:47 < LakshyaOjhaGitte> Hi @zoq can you please restart the AppVeyor test on my softshrink PR https://github.com/mlpack/mlpack/pull/2174
08:47 < LakshyaOjhaGitte> Thanks.
09:03 < kartikdutt18Gitt> Hi @prince776, I will take a look. Thanks.
09:26 -!- ImQ009 [~ImQ009@unaffiliated/imq009] has joined #mlpack
09:48 -!- rohit [firstname.lastname@example.org] has joined #mlpack
10:03 -!- rohit [email@example.com] has quit [Ping timeout: 260 seconds]
11:39 -!- johnsoncarl[m] [johnsoncar@gateway/shell/matrix.org/x-zwbubtpcoenwnrha] has joined #mlpack
11:42 < johnsoncarl[m]> Does anyone here use something like a virtual env to build and use mlpack?
11:42 < johnsoncarl[m]> So that I can delete the throwaway ones when not needed!
11:52 < zoq> johnsoncarl: I used the conda env.
11:52 < johnsoncarl[m]> ah. okay.
11:52 < johnsoncarl[m]> Thanks
11:52 < johnsoncarl[m]> zoq:
11:53 < johnsoncarl[m]> looks like i am already using it! :)
13:06 < zoq> LakshyaOjha: Looks like it's already queued.
13:11 < chopper_inbound[> Hi zoq, can you review this? I am waiting for this to be merged 😁
13:11 < chopper_inbound[> <https://github.com/mlpack/mlpack/pull/2196>
13:24 < zoq> chopper_in: Will do later today.
13:25 < chopper_inbound[> Thanks zoq
13:57 < kartikdutt18Gitt> Hi @zoq, could you have a look at #2195? I wanted to know how I should proceed with it.
14:26 -!- pranay2 [firstname.lastname@example.org] has joined #mlpack
14:28 -!- pranay2 [email@example.com] has left #mlpack 
14:28 < PrinceGuptaGitte> Hi everyone, I was trying to get a better understanding of the ANN codebase. I have a doubt.
14:28 < PrinceGuptaGitte> Why is everything templated instead of using an inheritance approach? For example, the BaseLayer class acts as a template for activation layers where we can use any type of activation function. We could also have made BaseLayer take an ActivationFunction base type and had all activation functions inherit from it. Could it be because virtual functions are slow? Or is there some other reason?
14:28 < PrinceGuptaGitte> I'm sorry if it's a silly doubt, but I don't understand why we would want to template everything.
14:28 -!- pranay2 [firstname.lastname@example.org] has joined #mlpack
14:31 -!- pranay [email@example.com] has joined #mlpack
14:32 < GauravSinghGitte> Hey, @prince776 you can read [this](https://www.mlpack.org/papers/mlpack2011.pdf) mlpack paper. It addresses your doubt in detail.
14:32 < PrinceGuptaGitte> Thanks for the reference
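[A heavily simplified sketch of the policy-based design being discussed; the names echo mlpack's BaseLayer but this is not the real implementation. The activation is a template parameter, so the call resolves at compile time and can be inlined, whereas the inheritance alternative would pay a virtual-dispatch cost on every evaluation:]

```cpp
#include <cassert>

// Policy classes: no common base class, no virtual functions.
struct ReLU
{
  static double Fn(const double x) { return x > 0.0 ? x : 0.0; }
};

struct Identity
{
  static double Fn(const double x) { return x; }
};

// The layer is parameterized on the activation policy. The call to
// ActivationFunction::Fn() is bound at compile time, so the compiler can
// inline it; a virtual Fn() would instead go through the vtable per element.
template<class ActivationFunction = Identity>
class BaseLayer
{
 public:
  double Forward(const double input) const
  {
    return ActivationFunction::Fn(input);
  }
};
```

[BaseLayer<ReLU> then behaves like a dedicated ReLU layer with zero dispatch overhead, which is the trade-off the mlpack paper motivates.]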
14:38 -!- pranay2 [firstname.lastname@example.org] has quit [Remote host closed the connection]
14:39 -!- pranay2 [email@example.com] has joined #mlpack
14:43 < GauravSinghGitte> Hi, @zoq kindly have a look at #2191; I have incorporated the changes that you suggested. Thank you.
14:46 -!- pranay [firstname.lastname@example.org] has quit [Remote host closed the connection]
14:46 -!- pranay2 [email@example.com] has quit [Remote host closed the connection]
15:02 -!- lozhnikov_ [~firstname.lastname@example.org] has quit [Ping timeout: 268 seconds]
15:08 -!- lozhnikov [~email@example.com] has joined #mlpack
16:30 < tejasvi[m]> What should the ideal development workflow be? The way I debug is tragic: after making changes in a file, I build and run mlpack_test, using BOOST_TEST_MESSAGE as a cout incarnate. Given the ~45m build time this isn't helpful enough. I tried gdb, but it refuses to drill beyond the code of the test file. Should I use gdb with handmade code like the examples in https://www.mlpack.org/doc/mlpack-3.2.2/doxygen/sample.html? I'm a
16:30 < tejasvi[m]> bit swamped here.
17:34 < SriramSKGitter[m> @tstomar[m] I've noticed much faster build times on subsequent builds. It's only the first time that it's ~45 min. Building with -jN and building only specific targets ought to bring compile time down to reasonable levels.
17:38 < PrinceGuptaGitte> @tstomar[m] Also, whenever you run the `make` command it only rebuilds the files that changed (and the files that depend on them)
17:40 < PrinceGuptaGitte> @kartikdutt18 I've noticed mlpack doesn't have a regular softmax layer; it only has log softmax. Should I add a regular softmax layer?
17:45 < zoq> PrinceGupta: There is an open PR that implements the regular SoftMax layer.
17:47 < sreenik[m]> Yes, I remember having started it but didn't finish it
17:47 < sreenik[m]> If I remember correctly there is probably some mistake in that PR, but it would be a lot easier to finish it up than to start the work from scratch
17:48 < zoq> agreed
17:48 < sreenik[m]> In case anyone is interested, feel free to take it up
18:08 -!- EL-SHREIFGitter[ [gitterel-s@gateway/shell/matrix.org/x-ffaqnewvlqzomilb] has joined #mlpack
18:08 < EL-SHREIFGitter[> why is there no mentor for the Visualization Tool project in GSoC 2020?
18:10 < zoq> EL-SHREIF: Nice catch, will update it later.
18:11 * PrinceGuptaGitte sent a long message: < https://matrix.org/_matrix/media/r0/download/matrix.org/eqBrxdXKlakTiQPFRejGseCg >
18:12 < PrinceGuptaGitte> I don't understand what the problem is here
18:12 < zoq> PrinceGupta: My first guess is the input size isn't correct.
18:13 < PrinceGuptaGitte> Input size is (42000, 784) where 42000 is the no. of data samples
18:13 < PrinceGuptaGitte> and the output is a one hot encoded matrix of (42000, 10)
18:17 < zoq> PrinceGupta: Note Armadillo is column-major, so the matrix size should be (784, 42000).
18:22 < PrinceGuptaGitte> ok I'll try to transpose them
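[A tiny illustration of what column-major means, in plain C++ rather than Armadillo itself: element (i, j) of an nRows x nCols matrix lives at index i + j * nRows, so each column is contiguous in memory, which is why mlpack stores one data point per column and expects a (784, 42000) matrix here:]

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

// Column-major storage (Armadillo's convention): element (i, j) of an
// nRows x nCols matrix is at flat index i + j * nRows, so a COLUMN is one
// contiguous run of memory. Storing one sample per column means each data
// point's 784 features sit together, e.g. a 784 x 42000 dataset matrix.
struct ColMajorMatrix
{
  std::size_t nRows, nCols;
  std::vector<double> data;

  ColMajorMatrix(const std::size_t r, const std::size_t c)
      : nRows(r), nCols(c), data(r * c) { }

  double& operator()(const std::size_t i, const std::size_t j)
  {
    return data[i + j * nRows];
  }
};
```

[So rather than transposing after the fact, loading the data as features-by-samples keeps each sample contiguous.]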
18:29 * PrinceGuptaGitte sent a long message: < https://matrix.org/_matrix/media/r0/download/matrix.org/amkXrgHeRFErBkeenGKvrEVe >
18:38 < PrinceGuptaGitte> @zoq apparently the program works when I only take 1 sample (from the 42000 available) and then feed that to `.Train()` function.
18:43 < GauravSinghGitte> I don't know if it would be helpful, but you can have a look at [this](https://github.com/mlpack/models/blob/master/Kaggle/DigitRecognizer/src/DigitRecognizer.cpp); it also performs the same task you are trying to implement.
18:44 < GauravSinghGitte> @prince776
18:49 < zoq> agreed, the NegativeLogLikelihood expects a scalar between [1, number of classes], and not a one hot encoded target.
18:57 < PrinceGuptaGitte> Thanks for the help.
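[Assuming the labels start out one-hot encoded as above, a sketch of the conversion zoq's remark implies, in plain illustrative C++ rather than mlpack code: take the argmax of each one-hot row and shift it to a one-based label:]

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

// Convert one one-hot encoded target into the scalar class label in
// [1, number of classes] described above: the argmax gives the zero-based
// class index, and the +1 shifts it to a one-based label.
std::size_t OneHotToLabel(const std::vector<double>& oneHot)
{
  std::size_t best = 0;
  for (std::size_t i = 1; i < oneHot.size(); ++i)
    if (oneHot[i] > oneHot[best])
      best = i;
  return best + 1;
}
```

[Applied column by column, this turns the (42000, 10) one-hot matrix into the 1 x 42000 row of labels the loss expects.]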
19:35 -!- ibtihaj [firstname.lastname@example.org] has joined #mlpack
20:14 -!- Vishwas254 [email@example.com] has joined #mlpack
20:34 -!- ibtihaj [firstname.lastname@example.org] has quit [Quit: Ping timeout (120 seconds)]
20:42 -!- travis-ci [~email@example.com] has joined #mlpack
20:42 < travis-ci> shrit/ensmallen#3 (citations - 0a66f6c : Omar Shrit): The build passed.
20:42 < travis-ci> Change view : https://github.com/shrit/ensmallen/commit/0a66f6cd2c84
20:42 < travis-ci> Build details : https://travis-ci.com/shrit/ensmallen/builds/149112389
20:42 -!- travis-ci [~firstname.lastname@example.org] has left #mlpack 
21:04 -!- ImQ009 [~ImQ009@unaffiliated/imq009] has quit [Quit: Leaving]
21:12 -!- Vishwas254 [email@example.com] has quit [Remote host closed the connection]
23:14 < zoq> ToshalAgrawal: Would you like to add yourself as a mentor for the Visualization Tool idea?
--- Log closed Sun Feb 16 00:00:04 2020