What is the nature of the overhead on the USB 2.0 interface? I did search the web for any explanation, but came back empty-handed so far. The standard talks about 480 Mb/s, which could be 60 MB/s (or 48 MB/s in case an 8-in-10 bit encoding is perhaps used).

You want a theoretical explanation of a practical issue? Then again, now that we're discussing it, would Wonko kindly give it a try, please?

From what I understand, the bus is "half-duplex". In an extremely simplified way, think of the difference between a "normal" telephone conversation and one over a walkie-talkie:

"Hi, I am the motherboard controller, anyone connected to this bus? Over."
"Hi, I am a USB Mass Storage device, are you a motherboard controller? Over."
"Hi, I am the motherboard controller, Roger and Over."
"Ok, I am going to send you some data, Over."
"Fine, I am ready to receive the data, Over."
"You still there, or did some demented user disconnect us in the meantime? Over."

The data is transferred in "frames"; each frame is made of three "packets", of which the middle ("data") one is optional.
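The arithmetic behind those figures can be sketched quickly. This is only a back-of-the-envelope reckoning: the 48 MB/s number is just the 8-in-10 encoding hypothesis from the question (USB 2.0 high speed actually uses NRZI with bit stuffing, not 8b/10b), and the last figure is the bulk-transfer ceiling from the USB 2.0 spec, which allows at most 13 data packets of 512 bytes per 125 µs microframe:

```python
# Back-of-the-envelope USB 2.0 throughput figures.

RAW_BPS = 480_000_000               # high-speed signaling rate: 480 Mb/s

naive = RAW_BPS / 8                 # 60 MB/s: raw bits -> bytes, zero overhead
eight_in_ten = RAW_BPS * 8 / 10 / 8 # 48 MB/s, IF an 8-in-10 encoding were used
# (Just the hypothesis from the question; high-speed USB 2.0 really uses
#  NRZI with bit stuffing, so the encoding tax is variable, not a flat 20%.)

# Protocol-level ceiling for bulk transfers per the USB 2.0 spec: at most
# 13 x 512-byte data packets per 125 microsecond microframe, 8000 of which
# occur per second. Token/handshake packets eat the rest of the microframe.
bulk_ceiling = 13 * 512 * 8000      # bytes/s, about 53.2 MB/s

print(naive, eight_in_ten, bulk_ceiling)
```

So even a "perfect" USB 2.0 bulk device tops out around 53 MB/s before any controller or drive inefficiency enters the picture; real devices land well below that.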
But USB 2.0 controllers were made to cope with the USB 2.0 standard, whereas those for USB 3.0 needed to be much faster, internally, to be able to abide by the newer USB 3.0 standard. This also means they're insanely faster than required to deal with USB 2.0: when operating in "USB 2.1" mode they use the USB 2.0 signaling and protocols, but present virtually zero hysteresis and minimal overhead, because "USB 2.1" does not require them to slow down any more than is needed to cope with the 4-wire interface, and what ends up limiting the final transfer rate is the motherboard southbridge USB controller or the add-on card (when present). Hence, differently from a USB 2.0 adapter, a USB 3.0 adapter connected to a USB 2.0 port will present almost no overhead at all.

**The same applies equally to USB 3.0 pendrives in "USB 2.1" mode.** Such a drive does default to "2.1" mode, but remains faster than any real USB 2.0 device would ever be.

Just to add some uncertainty to the matter: how can anyone say that a USB 2.0 device is really a USB 2.0 one (and not, as an example, a USB 2.1)? I mean, is the USB "standard" a "forced speed cap", a "top speed", or a "minimal speed"? The USB 2.0 standard (besides having been managed in a far from "standard" way) is not really "static" and unchanged since 2000.

Another thing that we really do not know (or know in detail) is how exactly the "Turbo" drivers for selected USB 2.0 devices manage, on the same hardware, to get a speed increase (and how much of that has later, hypothetically, been transferred to the actual firmware/controller of the device/port).

The "gap" between theoretical USB 2.0 speed and USB 3.0 speed is so large that I wouldn't be entirely surprised if, when comparing the performance of a USB 2.0 adapter/interface/port actually built in (say) 2001 against a corresponding piece of hardware built in (still say) 2012, it would come out that the first barely reaches standard speeds while the latter exceeds them.
Is the Dell U2410F monitor card reader a USB 2.0 one? I don't have any idea how to explain this. Theoretically, a 3.0 reader should default to 2.0 when used with a 2.0 port.

Well, USB 2.0 is from 2000, and even at that time it's questionable whether it represented the state of the art of data transmission, since other matters (viz. being a one-type-fits-all interface, for instance) seem to have been paramount in the minds of the people who defined the standard.