Is there any way to store big data?

Apr 14, 2013 at 3:50 AM
Edited Apr 14, 2013 at 3:53 AM
Is there any way to store a large byte array?
I tried using the Store method in the usual way:
fac.Store(bytes);
But I found that an array of 1,000,000 bytes used 18 MB on my HDD!
Is there any more efficient way to store large data?

Edit:
I tried the DefragmentTo method, but the file is still as large as the original...
Coordinator
Apr 14, 2013 at 4:51 PM
Hello,
I agree, NDatabase is not designed to store binary data (media) efficiently, and there is only one way of storing objects (it is the same for all).
Apr 18, 2013 at 3:28 AM
Oh, that's a little disappointing.
Still, I think I found a solution: save the data to an external file and just store the file's path in the database, though it makes the database span multiple files...
Thanks anyway!
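Something like this (a rough sketch, assuming the OdbFactory.Open/Store calls shown in the NDatabase examples; the BlobReference class is just my own illustration):

    using System;
    using System.IO;
    using NDatabase;

    // Only this small object is persisted in the db; the raw bytes live
    // in their own file on disk.
    public class BlobReference
    {
        public string FilePath { get; set; }
    }

    public static class ExternalBlobStore
    {
        public static BlobReference Save(string dbPath, string blobDir, byte[] data)
        {
            // Write the raw bytes to an external file...
            var path = Path.Combine(blobDir, Guid.NewGuid().ToString("N") + ".bin");
            File.WriteAllBytes(path, data);

            // ...and store only its address in the database.
            var reference = new BlobReference { FilePath = path };
            using (var odb = OdbFactory.Open(dbPath))
            {
                odb.Store(reference);
            }
            return reference;
        }

        public static byte[] Load(BlobReference reference)
        {
            return File.ReadAllBytes(reference.FilePath);
        }
    }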
Coordinator
Apr 29, 2013 at 7:01 AM
Hello,
I propose raising a ticket for a new feature: storing media data (binary data) as part of NDatabase. If more users wish for it, I will be happy to add it.

Thanks,
Jacek
Apr 30, 2013 at 3:10 AM
I need this feature too. Right now I use a trigger to save and load the stream from a file, but I don't feel it is the best way.
But if we save this data into NDatabase, will it make loading very heavy?
May 1, 2013 at 3:52 AM
Edited May 1, 2013 at 3:56 AM
Well, that's exciting. You know, it would make NDatabase not just a database but a file/data manager for .NET, thanks to its simplicity and NoSQL design. Just store the target, then ask for it when you need it. We would not have to deal with serializers, FileStream, file names, or other low-level file system details. What's more, it would turn our config files into just one file. How cool that would be!
So I think most people would be glad to welcome this feature.
May 1, 2013 at 4:25 AM
HectorTsay wrote:
But if we save this data into NDatabase, will it make loading very heavy?
Well, I don't think that's a problem.
I have written a program that provides multiple FileStreams inside just one file.
Like a linked list, it divides the file into blocks (1 KB each), and there is a 32-bit integer at the end of each block which indicates the next block of the current stream.
At the least, it is an efficient solution.
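Roughly, the idea looks like this (a simplified sketch of my own, not the actual program; each 1 KB block here carries 1020 payload bytes plus a 4-byte index of the next block, with -1 marking the end of a stream):

    using System;
    using System.IO;

    public static class BlockFile
    {
        private const int BlockSize = 1024;                       // 1 KB per block
        private const int PayloadSize = BlockSize - sizeof(int);  // 1020 data bytes

        // Appends data as a chain of blocks and returns the first block index.
        // This sketch appends the chain contiguously; the pointer scheme also
        // allows blocks of different streams to interleave.
        public static int Write(string path, byte[] data)
        {
            using (var fs = new FileStream(path, FileMode.OpenOrCreate, FileAccess.Write))
            {
                int firstBlock = (int)(fs.Length / BlockSize);
                int blockCount = (data.Length + PayloadSize - 1) / PayloadSize;
                for (int i = 0; i < blockCount; i++)
                {
                    var block = new byte[BlockSize];
                    int offset = i * PayloadSize;
                    int count = Math.Min(PayloadSize, data.Length - offset);
                    Array.Copy(data, offset, block, 0, count);
                    int next = (i == blockCount - 1) ? -1 : firstBlock + i + 1;
                    BitConverter.GetBytes(next).CopyTo(block, PayloadSize);
                    fs.Seek((long)(firstBlock + i) * BlockSize, SeekOrigin.Begin);
                    fs.Write(block, 0, BlockSize);
                }
                return firstBlock;
            }
        }

        // Follows the chain from firstBlock; the stream length would be kept
        // alongside the first block index (for example, in the db record).
        public static byte[] Read(string path, int firstBlock, int totalLength)
        {
            var result = new byte[totalLength];
            using (var fs = new FileStream(path, FileMode.Open, FileAccess.Read))
            {
                var buffer = new byte[BlockSize];
                int block = firstBlock, copied = 0;
                while (block != -1 && copied < totalLength)
                {
                    fs.Seek((long)block * BlockSize, SeekOrigin.Begin);
                    fs.Read(buffer, 0, BlockSize);
                    int count = Math.Min(PayloadSize, totalLength - copied);
                    Array.Copy(buffer, 0, result, copied, count);
                    copied += count;
                    block = BitConverter.ToInt32(buffer, PayloadSize);
                }
            }
            return result;
        }
    }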
May 1, 2013 at 4:26 AM
Yes, I agree. I tried dumping large amounts of data to a file, but it requires using the trigger function. However, when the data structure is an inherited object, I ran into another problem: NDatabase's triggers do not support inheritance. So I still hope this feature is added quickly!
Jun 14, 2013 at 5:35 AM
jacek wrote:
Hello,
I propose raising a ticket for a new feature: storing media data (binary data) as part of NDatabase. If more users wish for it, I will be happy to add it.

Thanks,
Jacek
I second adding this new feature!
I plan to use NDatabase to store some big binary content, and in testing I found that the db grew quickly, so for now I have decided to use NDatabase + RaptorDB for my solution. When this new feature in NDatabase is finished, I will drop RaptorDB.
Jun 14, 2013 at 9:29 AM
Well, I'm looking forward to this new feature.
I looked at the db file generated by the current NDatabase. It seems there are a few control characters around each element. And I think it is rare to modify only part of the media data (we usually read it all or write it all), which means the control characters around the bytes are a little wasteful (I'm just guessing). So we could learn from .NET assemblies, which keep a strings heap: divide the db file into two parts, one the normal db as before, the other storing sequential binary data. Then only a small token would sit in the normal db among the control characters, indicating the address of the real data in the binary heap.
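In case it helps, here is how I picture the binary heap half (just a sketch of the idea; the BlobToken type and layout are made up for illustration, not NDatabase internals):

    using System.IO;

    // The small token stored in the normal db part; it points at the
    // real bytes inside the binary heap file.
    public class BlobToken
    {
        public long Offset { get; set; }
        public int Length { get; set; }
    }

    public static class BlobHeap
    {
        // Appends the bytes to the heap file and returns the token to store.
        public static BlobToken Append(string heapPath, byte[] data)
        {
            using (var fs = new FileStream(heapPath, FileMode.Append, FileAccess.Write))
            {
                var token = new BlobToken { Offset = fs.Position, Length = data.Length };
                fs.Write(data, 0, data.Length);
                return token;
            }
        }

        // Resolves a token back to the raw bytes.
        public static byte[] Read(string heapPath, BlobToken token)
        {
            using (var fs = new FileStream(heapPath, FileMode.Open, FileAccess.Read))
            {
                fs.Seek(token.Offset, SeekOrigin.Begin);
                var data = new byte[token.Length];
                fs.Read(data, 0, token.Length);
                return data;
            }
        }
    }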
Jul 4, 2013 at 11:51 AM
Edited Jul 4, 2013 at 11:54 AM
Please do implement this; it would make my life much easier. While playing around with this db, everything seems fine except storing binary data. Every time I need to do it, I get ridiculously large files - a db file over 50 MB is a joke when testing small sets of data. I'd rather not think about a real-life scenario.
Jul 10, 2014 at 10:58 AM
Edited Jul 10, 2014 at 10:59 AM
If this helps anyone, I was having the same issue with byte[] which I was using for image files. A 56KB file was adding about 1.5MB to the database. To get around this, and not have to use external files, I encoded the byte array to a string before saving the object, then decoded back to a byte array to use the data. The increase in the db size now seems to reflect the size of the byte[] correctly. I don't know if this would work/be efficient for large files, but it seems to work for me.
    // ISO-8859-1 (Latin-1) maps every byte value 0-255 to exactly one char,
    // so the round trip is lossless. Encoding.Default is risky here: on a
    // multi-byte code page it can silently corrupt arbitrary binary data.
    public static string imageDataToString(byte[] imageData)
    {
        return System.Text.Encoding.GetEncoding("ISO-8859-1").GetString(imageData);
    }

    public static byte[] imageStringToData(string imageString)
    {
        return System.Text.Encoding.GetEncoding("ISO-8859-1").GetBytes(imageString);
    }
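One caveat with this trick: the round trip is only lossless if the encoding maps every byte value one-to-one (ISO-8859-1 does; the machine's default code page may not). If in doubt, the framework's Base64 helpers give the same guarantee with no code page assumptions, at roughly 33% size overhead:

    // Base64 variant: lossless for any byte[], regardless of code page.
    public static string imageDataToBase64(byte[] imageData)
    {
        return System.Convert.ToBase64String(imageData);
    }

    public static byte[] base64ToImageData(string imageString)
    {
        return System.Convert.FromBase64String(imageString);
    }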
Coordinator
Aug 10, 2014 at 8:38 AM
Hello,
please keep in mind that NDatabase is not designed to hold big/binary data; it is meant to work with objects and solve the complexity around them, not raw data.

Thanks,
Jacek
Oct 3, 2014 at 6:16 PM
Here is a class to handle that:
http://pastebin.de/141459
public ImageString Image { get; set; }