# Utility to write huge files instantly??? - Storage


# Thread: Utility to write huge files instantly???

1. ## Re: Utility to write huge files instantly???

mike wrote:

....

> I found the solution in another newsgroup.

Great - sorry I didn't have time to research it sufficiently here.

Sharing what you found would be nice, though: I'm curious how to do
what you wanted to do, because it seemed to me to be an original
deficiency in the Win32 interface and if SetEndOfFile doesn't remedy
that I'd like to know what does (other than going down to the native API).

- bill

2. ## Re: Utility to write huge files instantly???

Bill Todd wrote:
>
> ...
>
>>> That's clearly not what he wants.

>>
>> What he really want is an instant free answer to some strange problem.

>
> And for some reason you seem to have a problem with that.

I only have a problem when the person with a problem gets crabby and

> Others here don't. If you have nothing useful to contribute, you might
> consider just shutting up.

If usenet is too tough for you, you might consider leaving.

3. ## Re: Utility to write huge files instantly???

> Bill Todd wrote:
>>
>> ...
>>
>>>> That's clearly not what he wants.
>>> What he really want is an instant free answer to some strange problem.

>> And for some reason you seem to have a problem with that.

>
> I only have a problem when the person with a problem gets crabby and

Polite requests for information don't qualify as 'demands' in my book.
Perhaps you're just a bit thin-skinned about having been wrong about the
existence of the answer he was looking for.

>
>> Others here don't. If you have nothing useful to contribute, you might
>> consider just shutting up.

>
> If usenet is too tough for you, you might consider leaving.

I've probably been programming computers since before you were in
diapers, sonny - and I'll likely be here long after you've gone on to
something you're more competent at doing.

- bill

4. ## Re: Utility to write huge files instantly???

Bill Todd wrote:
>> Bill Todd wrote:
>>>
>>> ...
>>>
>>>>> That's clearly not what he wants.
>>>> What he really want is an instant free answer to some strange problem.
>>> And for some reason you seem to have a problem with that.

>>
>> I only have a problem when the person with a problem gets crabby and

>
> Polite requests for information don't qualify as 'demands' in my book.
> Perhaps you're just a bit thin-skinned about having been wrong about the
> existence of the answer he was looking for.
>
>>
>>> Others here don't. If you have nothing useful to contribute, you might
>>> consider just shutting up.

>>
>> If usenet is too tough for you, you might consider leaving.

>
> I've probably been programming computers since before you were in
> diapers, sonny - and I'll likely be here long after you've gone on to
> something you're more competent at doing.
>
> - bill

You didn't contribute anything useful just now. Maybe you should consider
just shutting up.

That's what a wise man suggested to me.

5. ## Re: Utility to write huge files instantly???

> Bill Todd wrote:
>>> Bill Todd wrote:
>>>>
>>>> ...
>>>>
>>>>>> That's clearly not what he wants.
>>>>> What he really want is an instant free answer to some strange problem.
>>>> And for some reason you seem to have a problem with that.
>>> I only have a problem when the person with a problem gets crabby and

>> Polite requests for information don't qualify as 'demands' in my book.
>> Perhaps you're just a bit thin-skinned about having been wrong about the
>> existence of the answer he was looking for.
>>
>>>
>>>> Others here don't. If you have nothing useful to contribute, you might
>>>> consider just shutting up.
>>> If usenet is too tough for you, you might consider leaving.

>> I've probably been programming computers since before you were in
>> diapers, sonny - and I'll likely be here long after you've gone on to
>> something you're more competent at doing.
>>
>> - bill

>
> You didn't contribute anything useful just now.

You're right: correcting the clueless is often wasted effort.

But you never know until you've tried.

- bill

6. ## Re: Utility to write huge files instantly???

Bill Todd wrote:
> mike wrote:
>
> ...
>
>> I found the solution in another newsgroup.

>
> Great - sorry I didn't have time to research it sufficiently here.

Thanks, but it's not your job to research it. I ask questions just in
case someone already knows the answer. If they have to figure it out,
they'd be doing my job.
>
> Sharing what you found would be nice, though: I'm curious how to do
> what you wanted to do, because it seemed to me to be an original
> deficiency in the Win32 interface and if SetEndOfFile doesn't remedy
> that I'd like to know what does (other than going down to the native API).
>
> - bill

After playing with it for a while, I've decided that SetEndOfFile
does exactly what you'd want it to do in a NORMAL situation.
Problem is that my situation isn't anywhere near NORMAL.
Normal stuff is boring ;-)

mike


7. ## Re: Utility to write huge files instantly???

Bill Todd wrote:
>> Bill Todd wrote:
>>>> Bill Todd wrote:
>>>>>
>>>>> ...
>>>>>
>>>>>>> That's clearly not what he wants.
>>>>>> What he really want is an instant free answer to some strange problem.
>>>>> And for some reason you seem to have a problem with that.
>>>> I only have a problem when the person with a problem gets crabby and
>>> Polite requests for information don't qualify as 'demands' in my book.
>>> Perhaps you're just a bit thin-skinned about having been wrong about the
>>> existence of the answer he was looking for.
>>>
>>>>
>>>>> Others here don't. If you have nothing useful to contribute, you might
>>>>> consider just shutting up.
>>>> If usenet is too tough for you, you might consider leaving.
>>> I've probably been programming computers since before you were in
>>> diapers, sonny - and I'll likely be here long after you've gone on to
>>> something you're more competent at doing.
>>>
>>> - bill

>>
>> You didn't contribute anything useful just now.

>
> You're right: correcting the clueless is often wasted effort.
>
> But you never know until you've tried.
>
> - bill

So are you saying you don't learn and are prone to wasting time?

8. ## Re: Utility to write huge files instantly???

> Bill Todd wrote:
>>> Bill Todd wrote:
>>>>> Bill Todd wrote:
>>>>>>
>>>>>> ...
>>>>>>
>>>>>>>> That's clearly not what he wants.
>>>>>>> What he really want is an instant free answer to some strange problem.
>>>>>> And for some reason you seem to have a problem with that.
>>>>> I only have a problem when the person with a problem gets crabby and
>>>> Polite requests for information don't qualify as 'demands' in my book.
>>>> Perhaps you're just a bit thin-skinned about having been wrong about the
>>>> existence of the answer he was looking for.
>>>>
>>>>>
>>>>>> Others here don't. If you have nothing useful to contribute, you might
>>>>>> consider just shutting up.
>>>>> If usenet is too tough for you, you might consider leaving.
>>>> I've probably been programming computers since before you were in
>>>> diapers, sonny - and I'll likely be here long after you've gone on to
>>>> something you're more competent at doing.
>>>>
>>>> - bill
>>> You didn't contribute anything useful just now.

>> You're right: correcting the clueless is often wasted effort.
>>
>> But you never know until you've tried.
>>
>> - bill

>
> So are you saying you don't learn and are prone to wasting time?

Not at all: just explaining why my earlier message only *turned out* to
be useless, rather than was so in principle (hence was worth posting on
the off-chance that you were capable of benefiting from it).

- bill

9. ## Re: Utility to write huge files instantly???

On Jan 19, 3:48 pm, mike wrote:
> Bill Todd wrote:
> > Maxim S. Shatskih wrote:
> >>> I don't think he wants to take the time to zero (or otherwise write)
> >>> the actual file data. ISTR that an (undocumented?) API exists to

>
> >> SetEndOfFile, it is documented.

>
> > Indeed, and that's clearly the right approach since it seems to
> > accomplish what's desired using a documented interface. But it was not
> > what I was remembering (see my other reply).

>
> > - bill

>
> I have two problems writing code. I'm lazy.
> And I have no idea what I'm doing.
> I found a VB6 code example and hacked it as follows:
>
> Private Sub Command1_Click()
> Path = "x:\big1.txt"
> hFile = CreateFile(Path, GENERIC_WRITE, FILE_SHARE_READ Or FILE_SHARE_WRITE, ByVal 0&, OPEN_ALWAYS, 0, 0)
> If hFile = -1 Then End
> WriteFile hFile, ByVal "Very-very cool & long string", 28, BytesWritten, ByVal 0&
>
> SetFilePointer hFile, 900000000, 0, FILE_BEGIN
> SetEndOfFile hFile
> CloseHandle hFile
> ...
>
> This does make a big file, but it fills the file with zeros
> and takes 18 minutes to do it.
> Am I using the wrong arguments?
>
> I need to do this 16 times...I need to get rid of the 18 minutes X 16...

You probably want to use SetFileValidData(). You'll need to authorize
yourself properly first (you need to enable the SE_MANAGE_VOLUME_NAME
privilege - and you'll need to be an administrator to do that).
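For anyone who wants to try that sequence, here is a rough Python/ctypes sketch. The function name `allocate_file_instantly` is made up for illustration; CreateFileW, SetFilePointerEx, SetEndOfFile, and SetFileValidData are the real kernel32 entry points, but this assumes the SE_MANAGE_VOLUME_NAME privilege is already enabled in the process token (i.e., you're running as an administrator) and that the volume is NTFS:

```python
import ctypes
import sys

GENERIC_WRITE = 0x40000000
OPEN_ALWAYS = 4
FILE_BEGIN = 0
INVALID_HANDLE_VALUE = -1

def allocate_file_instantly(path, size):
    """Illustrative sketch: allocate `size` bytes without zeroing them.

    Assumes SE_MANAGE_VOLUME_NAME is already enabled; otherwise
    SetFileValidData fails with ERROR_PRIVILEGE_NOT_HELD.
    """
    if sys.platform != "win32":
        raise OSError("SetFileValidData is a Windows/NTFS-only API")
    k32 = ctypes.windll.kernel32
    h = k32.CreateFileW(path, GENERIC_WRITE, 0, None, OPEN_ALWAYS, 0, None)
    if h == INVALID_HANDLE_VALUE:
        raise ctypes.WinError()
    try:
        # Extend the file: space gets allocated, but NTFS leaves
        # Initialized Size at zero, so no data is written yet.
        if not k32.SetFilePointerEx(h, ctypes.c_longlong(size), None, FILE_BEGIN):
            raise ctypes.WinError()
        if not k32.SetEndOfFile(h):
            raise ctypes.WinError()
        # Mark the whole allocation as valid, skipping the lazy zero-fill
        # (this also exposes whatever stale data is in those clusters).
        if not k32.SetFileValidData(h, ctypes.c_longlong(size)):
            raise ctypes.WinError()
    finally:
        k32.CloseHandle(h)
```

On anything other than Windows the function simply refuses to run, since there is no portable equivalent of SetFileValidData.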

10. ## Re: Utility to write huge files instantly???

robertwessel2@yahoo.com wrote:

....

> You probably want to use SetFileValidData().

He'd only need that if he wanted to be able to *access* the garbage in
the file, which he said he doesn't need to do.

- bill

11. ## Re: Utility to write huge files instantly???

Bill Todd wrote:
>> Bill Todd wrote:
>>>> Bill Todd wrote:
>>>>>> Bill Todd wrote:
>>>>>>>
>>>>>>> ...
>>>>>>>
>>>>>>>>> That's clearly not what he wants.
>>>>>>>> What he really want is an instant free answer to some strange problem.
>>>>>>> And for some reason you seem to have a problem with that.
>>>>>> I only have a problem when the person with a problem gets crabby and
>>>>> Polite requests for information don't qualify as 'demands' in my book.
>>>>> Perhaps you're just a bit thin-skinned about having been wrong about the
>>>>> existence of the answer he was looking for.
>>>>>
>>>>>>
>>>>>>> Others here don't. If you have nothing useful to contribute, you might
>>>>>>> consider just shutting up.
>>>>>> If usenet is too tough for you, you might consider leaving.
>>>>> I've probably been programming computers since before you were in
>>>>> diapers, sonny - and I'll likely be here long after you've gone on to
>>>>> something you're more competent at doing.
>>>>>
>>>>> - bill
>>>> You didn't contribute anything useful just now.
>>> You're right: correcting the clueless is often wasted effort.
>>>
>>> But you never know until you've tried.
>>>
>>> - bill

>>
>> So are you saying you don't learn and are prone to wasting time?

>
> Not at all: just explaining why my earlier message only *turned out* to
> be useless, rather than was so in principle (hence was worth posting on
> the off-chance that you were capable of benefiting from it).
>
>
> - bill

Well, when you're back in diapers I'll let you know how my job works out.

12. ## Re: Utility to write huge files instantly???

On Jan 23, 1:41 am, Bill Todd wrote:
> robertwess...@yahoo.com wrote:
>
> ...
>
> > You probably want to use SetFileValidData().

>
> He'd only need that if he wanted to be able to *access* the garbage in
> the file, which he said he doesn't need to do.

Well, the idea was that he'd be able to allocate space to a file
without having the OS zero all that space.

Anyway, I played with this a bit, and using the SetFilePointer/
SetEndOfFile technique, Windows, at least on XP and Vista, *will*
allocate space without zeroing it (but will read it as zeros), so long
as the volume is NTFS. That makes a certain sense, since there's no
place in FAT to store such information.

He does appear to zero pages as needed if you write actual data in the
file. So while the allocation is very quick, seeking to the end of
the file and writing a byte gets the entire file zeroed.

OTOH, you *can* read all the zeros without delay.

To do this on a removable drive you need to force the format to NTFS
(FAT being the default), which requires changing a parameter for the
drive.
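The seek-past-EOF idea above can be sketched portably in Python. This is a POSIX sparse-file analogue, not NTFS itself (POSIX holes never get zero-filled on a later write the way an NTFS Initialized Size gap does); `ftruncate` stands in for the SetFilePointer + SetEndOfFile pair:

```python
import os
import tempfile

size = 900_000_000  # the ~900 MB from mike's example

fd, path = tempfile.mkstemp()
try:
    # POSIX analogue of SetFilePointer + SetEndOfFile: extend the file
    # without writing any data. The extension is effectively instant.
    os.ftruncate(fd, size)
    apparent = os.stat(path).st_size
    # The unwritten region reads back as zeros immediately - no
    # 18-minute zeroing pass.
    with open(path, "rb") as f:
        f.seek(size - 4)
        tail = f.read(4)
finally:
    os.close(fd)
    os.remove(path)

print(apparent)  # 900000000
print(tail)      # b'\x00\x00\x00\x00'
```

On filesystems that support sparse files, `st_blocks` would also show that almost no disk space was actually consumed.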

13. ## Re: Utility to write huge files instantly???

> Bill Todd wrote:
>>> Bill Todd wrote:
>>>>> Bill Todd wrote:
>>>>>>> Bill Todd wrote:
>>>>>>>>
>>>>>>>> ...
>>>>>>>>
>>>>>>>>>> That's clearly not what he wants.
>>>>>>>>> What he really want is an instant free answer to some strange problem.
>>>>>>>> And for some reason you seem to have a problem with that.
>>>>>>> I only have a problem when the person with a problem gets crabby and
>>>>>> Polite requests for information don't qualify as 'demands' in my book.
>>>>>> Perhaps you're just a bit thin-skinned about having been wrong about the
>>>>>> existence of the answer he was looking for.
>>>>>>
>>>>>>>
>>>>>>>> Others here don't. If you have nothing useful to contribute, you might
>>>>>>>> consider just shutting up.
>>>>>>> If usenet is too tough for you, you might consider leaving.
>>>>>> I've probably been programming computers since before you were in
>>>>>> diapers, sonny - and I'll likely be here long after you've gone on to
>>>>>> something you're more competent at doing.
>>>>>>
>>>>>> - bill
>>>>> You didn't contribute anything useful just now.
>>>> You're right: correcting the clueless is often wasted effort.
>>>>
>>>> But you never know until you've tried.
>>>>
>>>> - bill
>>> So are you saying you don't learn and are prone to wasting time?

>> Not at all: just explaining why my earlier message only *turned out* to
>> be useless, rather than was so in principle (hence was worth posting on
>> the off-chance that you were capable of benefiting from it).
>>
>>
>> - bill

>
> Well, when you're back in diapers I'll let you know how my job works out.

Unlike you I didn't ask, so you needn't bother. But who knows? If you
manage to remain in a technical job that long, you may actually become
competent at it.

- bill

14. ## Re: Utility to write huge files instantly???

robertwessel2@yahoo.com wrote:
> On Jan 23, 1:41 am, Bill Todd wrote:
>> robertwess...@yahoo.com wrote:
>>
>> ...
>>
>>> You probably want to use SetFileValidData().

>> He'd only need that if he wanted to be able to *access* the garbage in
>> the file, which he said he doesn't need to do.

>
>
> Well, the idea was that he'd be able to allocate space to a file
> without having the OS zero all that space.

Yes - and there's no intrinsic reason why SetFileValidData should have
any effect on that: the normal byte-granularity end-of-data marker
should suffice regardless of where the allocation ends, unless it's
maintained only as a small integer offset within the last file cluster
rather than as a 64-bit integer.

>
> Anyway, I played with this a bit, and using the SetFilePointer/
> SetEndOfFile technique,

But not SetFileValidData?

> Windows, at least on XP and Vista, *will*
> allocate space without zeroing it (but will read it as zeros), so long
> as the volume is NTFS. That makes a certain sense, since there's no
> place in FAT to store such information.

A quick search of mike's postings to other newsgroups seems to indicate
that his problem (zeroing the space allocated) occurred at Close time,
not at allocation time. He says that SetFileValidData (to a small
value) before closing the file did alleviate that, but not whether it
also released the unused space at Close (which would be consistent with,
say, maintaining ValidDataLength only in RAM rather than on disk).

>
> He does appear to zero pages as needed if you write actual data in the
> file. So while the allocation is very quick, seeking to the end of
> the file and writing a byte gets the entire file zeroed.
>
> OTOH, you *can* read all the zeros without delay.

Hmmm. If you can do that *without* using SetFileValidData, then
apparently SetEndOfFile is moving the end-of-data mark rather than just
allocating space - unlike (I think, though I haven't tried it) the case
with using the NtCreateFile approach with an AllocationSize.

And in that case using SetFileValidData to move the end-of-data mark
back to the start of the file would avoid the zeroing on Close that mike
saw (without necessarily deallocating the space).

>
> To do this on a removable drive you need to force the format to NTFS
> (FAT being the default), which requires changing a parameter for the
> drive.

Yeah - he did originally say FAT32 *or* NTFS, but all the subsequent
discussion (including mention of sparse files) assumed the NTFS facilities.

- bill

15. ## Re: Utility to write huge files instantly???

Bill Todd wrote:
>> Bill Todd wrote:
>>>> Bill Todd wrote:
>>>>>> Bill Todd wrote:
>>>>>>>> Bill Todd wrote:
>>>>>>>>>
>>>>>>>>> ...
>>>>>>>>>
>>>>>>>>>>> That's clearly not what he wants.
>>>>>>>>>> What he really want is an instant free answer to some strange problem.
>>>>>>>>> And for some reason you seem to have a problem with that.
>>>>>>>> I only have a problem when the person with a problem gets crabby and
>>>>>>> Polite requests for information don't qualify as 'demands' in my book.
>>>>>>> Perhaps you're just a bit thin-skinned about having been wrong about the
>>>>>>> existence of the answer he was looking for.
>>>>>>>
>>>>>>>>
>>>>>>>>> Others here don't. If you have nothing useful to contribute, you might
>>>>>>>>> consider just shutting up.
>>>>>>>> If usenet is too tough for you, you might consider leaving.
>>>>>>> I've probably been programming computers since before you were in
>>>>>>> diapers, sonny - and I'll likely be here long after you've gone on to
>>>>>>> something you're more competent at doing.
>>>>>>>
>>>>>>> - bill
>>>>>> You didn't contribute anything useful just now.
>>>>> You're right: correcting the clueless is often wasted effort.
>>>>>
>>>>> But you never know until you've tried.
>>>>>
>>>>> - bill
>>>> So are you saying you don't learn and are prone to wasting time?
>>> Not at all: just explaining why my earlier message only *turned out* to
>>> be useless, rather than was so in principle (hence was worth posting on
>>> the off-chance that you were capable of benefiting from it).
>>>
>>>
>>> - bill

>>
>> Well, when you're back in diapers I'll let you know how my job works out.

>
> Unlike you I didn't ask, so you needn't bother. But who knows? If you
> manage to remain in a technical job that long, you may actually become
> competent at it.
>
> - bill

Is that how things worked out for you?

16. ## Re: Utility to write huge files instantly???

....

> Is that how things worked out for you?

I can see how you might think that your own deficiencies (to the degree
that you recognize them at all) tend to be shared by the rest of the
world as well, and - regrettably - you might even be correct a lot of
the time in that assessment.

But not in this case (again, since you asked). Feel free to post a
final inane comment now if that's important to you.

- bill

17. ## Re: Utility to write huge files instantly???

On Jan 24, 6:52 pm, Bill Todd wrote:
> robertwess...@yahoo.com wrote:
> > Well, the idea was that he'd be able to allocate space to a file
> > without having the OS zero all that space.

>
> Yes - and there's no intrinsic reason why SetFileValidData should have
> any effect on that: the normal byte-granularity end-of-data marker
> should suffice regardless of where the allocation ends, unless it's
> maintained only as a small integer offset within the last file cluster
> rather than as a 64-bit integer.

My interest in testing this behavior has limits, and I didn't actually
try it, but my understanding is that SetFileValidData will cause the
file space that's been allocated, but not yet written into or
initialized, to be marked as valid. IOW, bypassing the zeroing. I'd
try it, but am insufficiently motivated to jump through the required
security hoops.

> > allocate space without zeroing it (but will read it as zeros), so long
> > as the volume is NTFS. That makes a certain sense, since there's no
> > place in FAT to store such information.

>
> A quick search of mike's postings to other newsgroups seems to indicate
> that his problem (zeroing the space allocated) occurred at Close time,
> not at allocation time. He says that SetFileValidData (to a small
> value) before closing the file did alleviate that, but not whether it
> also released the unused space at Close (which would be consistent with,
> say, maintaining ValidDataLength only in RAM rather than on disk).
>
>
>
> > He does appear to zero pages as needed if you write actual data in the
> > file. *So while the allocation is very quick, seeking to the end of
> > the file and writing a byte gets the entire file zeroed.

>
> > OTOH, you *can* read all the zeros without delay.

>
> Hmmm. If you can do that *without* using SetFileValidData, then
> apparently SetEndOfFile is moving the end-of-data mark rather than just
> allocating space - unlike (I think, though I haven't tried it) the case
> with using the NtCreateFile approach with an AllocationSize.
>
> And in that case using SetFileValidData to move the end-of-data mark
> back to the start of the file would avoid the zeroing on Close that mike
> saw (without necessarily deallocating the space).

It's definitely allocating space. The free space on the memory stick
goes down, and you appear to get a nice big contiguous allocation.

I did a little checking, and the mechanism is actually pretty
straightforward. In NTFS, a file is defined by a collection of
"attributes" in an MFT record. File contents are stored in $Data
attributes, of which there can be more than one (in fact, one is needed
for each contiguous block of allocated disk space).

Attributes come in two flavors - resident and non-resident. Resident
attributes are stored completely within the MFT, and are interesting
mainly for small files (so you'd likely be able to store the data from a
100-byte file completely within the MFT). A non-resident attribute is a
pointer to the run of blocks on the disk where the attribute is actually
stored.

A non-resident attribute includes (among other things) an Allocated
Size, an Actual Size, and an Initialized Size. Initialized Size gets set
to zero when you allocate space with the SetFilePointer technique, and
increased as needed (by actually initializing the space to zero) to fill
up any gap at the beginning when data gets written into that run. My
understanding is that in NTFS, SetFileValidData just bumps the
Initialized Size in the affected $Data attributes as needed (assuming
the space is already allocated).
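That bookkeeping can be illustrated with a toy model (purely hypothetical code, not real NTFS internals): reads past Initialized Size return zeros, writes beyond it zero-fill the gap first, and SetFileValidData just bumps the counter:

```python
import os

class DataAttribute:
    """Toy model of a non-resident $Data attribute's Initialized Size."""

    def __init__(self, allocated_size):
        self.allocated_size = allocated_size
        self.actual_size = allocated_size
        self.initialized_size = 0
        # Freshly allocated clusters hold whatever stale data was there.
        self._disk = bytearray(os.urandom(allocated_size))

    def read(self, offset, length):
        # Anything past InitializedSize must read as zeros, even though
        # the underlying clusters still hold stale data.
        out = bytearray(length)
        valid = max(0, min(length, self.initialized_size - offset))
        out[:valid] = self._disk[offset:offset + valid]
        return bytes(out)

    def write(self, offset, data):
        # Writing past InitializedSize forces the gap to be zeroed first
        # (the "seek to the end and write a byte" cost described above).
        if offset > self.initialized_size:
            self._disk[self.initialized_size:offset] = bytes(offset - self.initialized_size)
        self._disk[offset:offset + len(data)] = data
        self.initialized_size = max(self.initialized_size, offset + len(data))

    def set_file_valid_data(self, length):
        # Just bumps InitializedSize, exposing the stale cluster contents.
        self.initialized_size = length
```

For example, a fresh 16-byte attribute reads as all zeros; after `write(8, b"hi")`, bytes 0-7 have been zero-filled and Initialized Size has advanced to 10.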

18. ## Re: Utility to write huge files instantly???

Bill Todd wrote:
>
> ...
>
>> Is that how things worked out for you?

>
> I can see how you might think that your own deficiencies (to the degree
> that you recognize them at all) tend to be shared by the rest of the
> world as well, and - regrettably - you might even be correct a lot of
> the time in that assessment.
>
> But not in this case (again, since you asked). Feel free to post a
> final inane comment now if that's important to you.
>
> - bill

So is this statement correct- "you never capable in you technical job, at
the start and where you are now, even after decades of being in the
field"?

19. ## Re: Utility to write huge files instantly???

In article ,
>
>So is this statement correct- "you never capable in you technical job, at
>the start and where you are now, even after decades of being in the
>field"?

No. The statement is malformed: the first clause lacks a verb, and appears
to use the second person personal pronoun as the second person possessive
pronoun.

But don't feel bad. With practice, you may learn to construct correct
statements in English.

--
Thor Lancelot Simon tls@rek.tjls.com

"The inconsistency is startling, though admittedly, if consistency is to
be abandoned or transcended, there is no problem." - Noam Chomsky
