Comments on "R snippets: Generating all subsets of a set"

jebyrnes (imachordata.com), 2012-04-23:
How does the time on this compare to combn? I've been curious about alternatives to combn -- particularly ones that (like anything with plyr) can use multiple cores for functions that take a long time when doing all-possible-subsets kinds of things.

Bogumił Kamiński, 2012-04-24:
If you care about speed, the following code is much faster for large sets:

    all.subsets.fast <- function(set) {
      n <- length(set)
      bin <- vector(mode = "list", length = n)
      for (i in 1L:n) {
        bin[[i]] <- rep.int(c(rep.int(FALSE, 2L^(i - 1L)),
                              rep.int(TRUE, 2L^(i - 1L))),
                            2L^(n - i))
      }
      apply(do.call(cbind, bin), 1L, function(x) set[x])
    }

However, as you can see, it is more complex.

Anonymous, 2016-12-07:
Hi, your code is much faster than set_power() in the sets package, but when I try all.subsets.fast(seq(1:50)), it says: Error: cannot allocate vector of size 4194304.0 Gb. How can I fix it? Thanks!

Bogumił Kamiński, 2016-12-08:
Such a large structure simply does not fit into memory; you would have to use a generator rather than a materialized structure. Anyway, 2^50 = 1,125,899,906,842,624, so you would not be able to iterate over that many elements in any case.

Anonymous, 2016-12-18:
Thank you very much! I had attempted to generate all subsets of {1, ..., 200} to test an assumption of EBIC, but I just found it would take at least 2^200 bytes to store the power set. What should I read to understand your all.subsets.fast() function? What should I read to understand "generators"? I am a grad student in stats and have a BSc in math. Thank you very much!

Bogumił Kamiński, 2016-12-18:
My function: unfortunately, I think the best advice is to read the R help documentation for each function it uses.

Generators: https://en.wikipedia.org/wiki/Generator_(computer_programming)

Anonymous, 2016-12-18:
Thank you very much for your help! I will try to understand this function.
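The generator approach suggested in the thread can be sketched in R with a closure that yields one subset per call instead of materializing all 2^n at once. This is an editorial sketch, not code from the original post; the name subset.generator is illustrative. Note that while each call is cheap, visiting all 2^50 subsets remains infeasible, exactly as pointed out above.

```r
# Sketch of a subset "generator" in R (illustrative, not from the post).
# Returns a closure; each call decodes the next counter value as a bit
# mask over `set` and yields that subset, using O(n) memory per call
# instead of O(2^n) for the full power set. Counter arithmetic uses
# doubles, which are exact for counters below 2^53.
subset.generator <- function(set) {
  n <- length(set)
  i <- 0        # number of subsets yielded so far
  total <- 2^n  # total number of subsets
  function() {
    if (i >= total) return(NULL)          # exhausted
    mask <- (i %/% 2^(0:(n - 1))) %% 2 == 1  # bits of i select elements
    i <<- i + 1
    set[mask]
  }
}

gen <- subset.generator(c("a", "b", "c"))
gen()  # character(0) -- the empty subset
gen()  # "a"
```

Repeatedly calling gen() walks through all 8 subsets of {a, b, c} and then returns NULL, so a consumer can process subsets one at a time inside a loop without ever holding the full power set in memory.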