## What is this mathematical concept called?

Message boards : Science (non-SETI) : What is this mathematical concept called?

Cosmic_Ocean

Joined: 23 Dec 00
Posts: 2976
Credit: 11,407,630
RAC: 7,150
Message 1828950 - Posted: 7 Nov 2016, 1:47:00 UTC

I was just looking for a hopefully easy way to write code that systematically goes through every combination.

If I can figure that part out, the rest should be pretty easy: figure out which combination gets closest to the target without going over.

I know from my research that a lot of people very strongly advise against going through every possible combination of an array (and therefore never show a way to do it), because it is easier than you would think to reach billions of combinations, which could take anywhere from days to decades to process. But I still want to do that.

I can easily do it when I know how many elements are in the starting array, but I'm trying to find a way to do it when the number of elements isn't known in advance, so I can't hard-code that many layers of nested loops.

If it helps anything, I'm pretty sure I won't exceed 50 elements at any point, so if I absolutely have to.. I could go down the path of hard-coding the logic and loops for 50 or fewer elements.

As far as the Excel idea.. I'm trying to avoid time-consuming, by-hand trial and error. As stated in the first post, what I do now is pick two files and drag them into the compilation; it says 7.34 GiB remaining, so I look for something near that size without going over. I find something that's 6.91, which means there's about 400 MiB free, so I look at the sizes of the first two files and see if there's an alternative that's about 400 MiB larger. It's not perfect, and it kind of works for the most part, but I'd like a more efficient way to do this, and something that can do thousands to millions of iterations per second seems like a great way to go.

However... I think it's probably one of these situations:

XKCD - 1319

Linux laptop:
record uptime: 1511d 20h 19m (ended due to the power brick giving-up)
ID: 1828950 ·
Cosmic_Ocean

Joined: 23 Dec 00
Posts: 2976
Credit: 11,407,630
RAC: 7,150
Message 1846064 - Posted: 3 Feb 2017, 3:29:17 UTC

So I tried wrapping my head around what needed to be done. My knowledge is in C++, though it's not very extensive or advanced. I poked at it for a few hours, trying to figure out the logic behind all the nested loops needed to calculate every possible combination, and then gave up on it for a while.

A friend of mine who codes in six different languages, and is pretty good at all of them, said they could take a look at it, so they started working on it. Then, during another Google search to help them understand the difference between the "bin-packing problem" and the "knapsack problem", I somehow stumbled across a piece of software that already exists.

I can't seem to find a working official site for it, but this one looks the least sketchy and ad-laden: Automatic Disc Fit.

By default it has presets for the various media, but those use the absolute maximum capacities and don't take CDFS/UDF overhead into account. So I set a custom size of 25,023,000,000 bytes and dragged 146 files into it; the first dozen or so discs each have less than 5 MiB of free space, and then the free space grows as the available puzzle pieces become fewer and fewer. Not only does it display the arrangement, it can even copy or move the files into folders at a specified destination, ready to begin burning.
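That fill pattern (early discs nearly full, later discs with more and more slack) is what the classic first-fit-decreasing heuristic tends to produce. Whether Automatic Disc Fit actually uses FFD is an assumption on my part; this is just a sketch of the textbook heuristic, with arbitrary example sizes:

```cpp
#include <algorithm>
#include <cstdint>
#include <functional>
#include <iostream>
#include <vector>

// First-fit-decreasing bin packing: sort files largest-first, then drop each
// one into the first disc that still has room, opening a new disc when none
// does. A well-known heuristic, not necessarily what any given tool uses.
std::vector<std::vector<std::uint64_t>>
packDiscs(std::vector<std::uint64_t> files, std::uint64_t capacity) {
    std::sort(files.begin(), files.end(), std::greater<>());
    std::vector<std::vector<std::uint64_t>> discs;  // file sizes per disc
    std::vector<std::uint64_t> used;                // bytes used per disc
    for (std::uint64_t f : files) {
        bool placed = false;
        for (std::size_t i = 0; i < discs.size(); ++i) {
            if (used[i] + f <= capacity) {          // first disc with room wins
                discs[i].push_back(f);
                used[i] += f;
                placed = true;
                break;
            }
        }
        if (!placed) {                              // open a new disc
            discs.push_back({f});
            used.push_back(f);
        }
    }
    return discs;
}

int main() {
    auto discs = packDiscs({9, 8, 2, 2, 1}, 10);
    std::cout << discs.size() << " discs\n";        // prints "3 discs"
}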

It looks pretty handy and I like it. And it looks like what I had in my head as an eventual end-result for a piece of software.
Linux laptop:
record uptime: 1511d 20h 19m (ended due to the power brick giving-up)
ID: 1846064 ·
Gone with the wind (2)
Volunteer tester

Joined: 19 Nov 00
Posts: 41571
Credit: 41,951,437
RAC: 18
Message 1846112 - Posted: 3 Feb 2017, 7:37:27 UTC

Gosh, this takes me back to the early days, when we had to try to cram as much as possible into upper memory blocks (UMBs) above 640K. The problem was that some programs loaded large and then shrank smaller. Norton Utilities had a tool for doing it; it was a kind of bubble sort, I think. Happy days: if you got 690K free, you were doing well :-))
ID: 1846112 ·
bluestar

Joined: 5 Sep 12
Posts: 2898
Credit: 1,969,058
RAC: 0
Message 1846423 - Posted: 4 Feb 2017, 6:43:05 UTC

http://factordb.com/index.php?id=1100000000900490133

The OP makes this problem a bit difficult by placing two separate conditions on the possible outcome.

I made a file on this earlier, so this may not be the final answer yet.

When it comes to the OP, you first multiply all the numbers by each other.

Next, consider such a thing as the organization of a computer disk.

Before the purchase of this computer and then downloading the Yafu software, I was left with a calculator in order to compute the space used on a hard disk.

I found this did not work very well, and the method of placing the temporary results below each other, aligned from right to left, made for frequent errors.

You get a carry all the time, which needs to be added to the corresponding digits below.

For a given disk, you have to consider sectors, clusters, tracks, or cylinders, and possibly even more.

The notion of cylinders, when it comes to disks, can be slightly difficult to comprehend.

This also means such things as volumes, or rather partitions, on such disks, which can be different sizes.
ID: 1846423 ·
