mlpack IRC logs, 2020-04-19
Logs for the day 2020-04-19 (starts at 0:00 UTC) are shown below.
--- Log opened Sun Apr 19 00:00:19 2020
04:29 < Manav-KumarGitte> @zoq can we include it in mlpack?
06:13 -!- Netsplit *.net <-> *.split quits: SaraanshTandonGi, kritika12298Gitt, NishantKumarGitt, RohitKartikGitte, johnsoncarl[m], kunal12298Gitter, robotcatorGitter, AnjishnuGitter[m, HimanshuPathakGi, AniThoGitter[m], (+9 more, use /NETSPLIT to show all of them)
06:15 -!- Netsplit over, joins: PrinceGuptaGitte, AnjishnuGitter[m, kartikdutt18Gitt, robertohueso, johnsoncarl[m], NishantKumarGitt, AbhinavvermaGitt, AbdullahKhilji[m, RohitKartikGitte, SaraanshTandonGi (+9 more)
06:54 -!- ghostrider669 [~firstname.lastname@example.org] has joined #mlpack
07:02 -!- ghostrider669 [~email@example.com] has quit [Ping timeout: 256 seconds]
08:20 -!- ImQ009 [~ImQ009@unaffiliated/imq009] has joined #mlpack
09:20 -!- togo [~togo@2a02:6d40:34f8:8901:91d:3e06:d752:9bb0] has joined #mlpack
09:22 * hemal[m] sent a long message: < https://matrix.org/_matrix/media/r0/download/matrix.org/tfmHjaeirIMShytUxxDCceKr >
09:23 < hemal[m]> ```
09:23 < hemal[m]> oad_save_test.cpp:(.text+0x21e57): undefined reference to `bool mlpack::data::Load<double>(std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&, arma::SpMat<double>&, bool, bool)'
09:23 < hemal[m]> ```
09:23 * hemal[m] sent a long message: < https://matrix.org/_matrix/media/r0/download/matrix.org/WldAzQbaBUKUaTgjIhiVyyHh >
09:27 * hemal[m] sent a long message: < https://matrix.org/_matrix/media/r0/download/matrix.org/vqMwwOucVNerlQqaqMkAwoCs >
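[Editor's note: an "undefined reference to `mlpack::data::Load<double>(..., arma::SpMat<double>&, ...)`" at link time usually indicates a linking problem rather than a source bug — either the mlpack library is missing from the link line (or listed before the object files that use it), or the installed `libmlpack` was built from a different version than the headers in use, so the sparse `Load` overload does not exist in the binary. A hedged sketch of a typical link line; the flags and library names are assumptions for a 2020-era (mlpack 3.x) setup:]

```shell
# Sketch only: library names and flags are assumptions; adjust for
# your install. Object files should come before the libraries that
# resolve their symbols, and -lmlpack must be present.
g++ -std=c++11 load_save_test.cpp -o load_save_test \
    -lmlpack -larmadillo -lboost_serialization
```

If the link line already looks like this, checking that the installed `libmlpack` matches the headers (e.g. a leftover older system install shadowing a source build) is a reasonable next step.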
10:17 < JoelJosephGitter> > error: no matching function for call to ‘mlpack::ann::FFN<mlpack::ann::MeanSquaredError<>, mlpack::ann::GaussianInitialization>::Backward(arma::mat&, arma::mat&)’
10:17 < JoelJosephGitter> what could cause this?
10:25 < hemal[m]> Joel Joseph (Gitter): could you paste the lines of code causing the error ?
10:26 < JoelJosephGitter> https://pastebin.com/Rf0ASne1
10:33 < hemal[m]> Backward() requires 2 parameters, and you have passed only one (`actionProbs_target`); that could be the reason. Not sure though
10:34 < JoelJosephGitter> i made an edit in that link now... i did put in two arma::mat matrices; i think the reason is some mistake with my "build"...
10:34 < JoelJosephGitter> i'll revert and check..
11:12 * JoelJosephGitter sent a long message: < https://matrix.org/_matrix/media/r0/download/matrix.org/RNkEftMxnXKfqtnNRZGlQbVL >
11:13 < JoelJosephGitter> The problem goes away when I remove the `SoftMax` layer.
11:16 < JoelJosephGitter> I used the ann folder from @mrityunjay-tripathi 's softmax-layer branch
11:17 < JoelJosephGitter> On his fork
11:20 < chopper_inbound[> Joel Joseph (Gitter) : Let me take a look into it.
11:43 < chopper_inbound[> Joel Joseph (Gitter): Can you check the dimensions of the target vector and the dimension of the last "Linear" layer?
11:49 < JoelJosephGitter> As u can see from the code I pasted, the last linear layer has dimensions 2. And the target vector I used for "Backward" is the vector that I got from the output of "Forward".
11:51 < chopper_inbound[> ohh...there is more code? i didn't scroll down.
11:52 < JoelJosephGitter> `ss` is `3x2` and `as` is `2x2`
11:57 < chopper_inbound[> the second argument to the backward function is not the actual input. It is the output of the forward function (here `as`). And in place of `as` in the backward function call there should be the backpropagated error.
12:00 < JoelJosephGitter> but they have the dimensions, am i right?
12:00 < JoelJosephGitter> *same dimensions
12:00 < JoelJosephGitter> in this case
12:01 < chopper_inbound[> the `activation` here doesn't have the same dimension as the `input`.
12:02 < chopper_inbound[> *activation = output of forward function
12:03 < JoelJosephGitter> This code works when I remove the SoftMax layer... Can u check that?
12:04 < JoelJosephGitter> `.Backward(INPUT_TO_NET, OUTPUT_OF_NET, GRADIENT_MATRIX)`
12:06 < JoelJosephGitter> ohh.. you mean the softmax has input and output dimensions different?
12:07 < JoelJosephGitter> *softmax activation layer
12:07 < chopper_inbound[> No. You can refer to `softmax_impl.hpp` line no. 33, where I set the size of the output the same as the input.
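[Editor's note: the convention being discussed — `Backward()` receives the *activation* (the output of `Forward()`) plus the backpropagated error, and a softmax layer's output has the same size as its input — can be illustrated without mlpack. A minimal standalone sketch (plain C++, vectors only; not mlpack's actual implementation):]

```cpp
#include <algorithm>
#include <cmath>
#include <cstddef>
#include <vector>

// Forward pass of a softmax layer: output has the same size as the input.
std::vector<double> SoftmaxForward(const std::vector<double>& input)
{
  // Subtract the max for numerical stability before exponentiating.
  double maxVal = *std::max_element(input.begin(), input.end());
  std::vector<double> output(input.size());
  double sum = 0.0;
  for (std::size_t i = 0; i < input.size(); ++i)
  {
    output[i] = std::exp(input[i] - maxVal);
    sum += output[i];
  }
  for (double& v : output)
    v /= sum;
  return output;
}

// Backward pass: takes the *activation* (output of Forward()) and the
// backpropagated error g, not the original input.
// dL/dx_i = y_i * (g_i - sum_j g_j * y_j).
std::vector<double> SoftmaxBackward(const std::vector<double>& activation,
                                    const std::vector<double>& error)
{
  double dot = 0.0;
  for (std::size_t j = 0; j < activation.size(); ++j)
    dot += error[j] * activation[j];
  std::vector<double> delta(activation.size());
  for (std::size_t i = 0; i < activation.size(); ++i)
    delta[i] = activation[i] * (error[i] - dot);
  return delta;
}
```

Because the softmax outputs sum to one, the backward deltas sum to zero — a quick sanity check when debugging a dimension mismatch like the one above.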
12:08 < JoelJosephGitter> Hmm, then I don't understand why this error occurs
12:09 < JoelJosephGitter> Are u getting this error on your pc?
12:26 < JoelJosephGitter> https://pasteboard.co/J4wNANT.png u can see here that my Forward and Backward code should work if you compare it to the "test" for MeanSquaredError module.
12:35 < chopper_inbound[> I understand. Basically I am getting results when `activation` is a vector and not a matrix. The backward function has to be extended for matrix input as well. Maybe you can try flattening the output if that doesn't harm for now. Thanks.
12:35 < JoelJosephGitter> I can't flatten the output since it's going in in batches.
12:36 < JoelJosephGitter> Are u suggesting I do a loop
12:37 < JoelJosephGitter> I am putting in several training inputs in one go.
12:37 < chopper_inbound[> Ok. No problem, I will try to fix that if you give me some time
12:37 < JoelJosephGitter> Thanks for the reply.
12:38 < JoelJosephGitter> :) sure.
18:37 -!- metaljack34 [~user@2605:6000:1b0b:51f:a04b:ede:3ce:16b] has joined #mlpack
18:38 < metaljack34> Does ID3 (decision_tree) implementation support pruning?
18:41 < chopper_inbound[> <JoelJosephGitter ":) sure."> I have made the changes and https://pastebin.com/embed_iframe/FL831rBR works fine. The earlier test has some flaw (I am not quite sure what the error is), as even after the change there were some errors regardless of the type of "last layer" used. For example: https://pastebin.com/embed_iframe/HVYRwQWf
19:27 -!- ImQ009 [~ImQ009@unaffiliated/imq009] has quit [Quit: Leaving]
22:57 -!- togo [~togo@2a02:6d40:34f8:8901:91d:3e06:d752:9bb0] has quit [Quit: Leaving]
--- Log closed Mon Apr 20 00:00:21 2020