Quorten Blog 1

First blog for all Quorten's blog-like writings

Okay, just checking. How do you create a PDF, with its hyperlinks intact, from a Word document in early versions of Microsoft Word? The short answer is that you can’t; you can only create PDF documents without hyperlinks.

20170508/DuckDuckGo microsoft word print postscript pdf bookmarks hyperlinks
20170508/https://superuser.com/questions/95222/from-word-to-pdf-including-bookmarks
20170508/http://www.schmitz-huebsch.net/gs4word/index.html
20170508/http://www.schmitz-huebsch.net/gs4word_de/gs4word.html
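
The gs4word links above revolve around Ghostscript’s pdfmark operators: print the document to PostScript (losing the hyperlinks), then feed Ghostscript an extra PostScript file of pdfmark commands that re-create bookmarks and link annotations in the output PDF. Here is a minimal, hypothetical sketch of that route; the file names, page number, and link rectangle are made-up placeholders, and it assumes Ghostscript (gs, or gswin64c on Windows) is installed.

    # Hypothetical pdfmark sketch: merge a bookmark and a hyperlink annotation
    # into a PDF produced from an already-printed PostScript file.  All names
    # below are placeholders, not taken from the original post.
    import subprocess

    PDFMARKS = r"""
    % One bookmark (outline entry) pointing at page 1.
    [ /Title (Introduction) /Page 1 /OUT pdfmark
    % One clickable URI link placed on page 1 over a rectangle in PDF points.
    [ /Rect [72 700 300 715]
      /SrcPg 1
      /Action << /Subtype /URI /URI (https://example.com/) >>
      /Border [0 0 0] /Subtype /Link /ANN pdfmark
    """

    with open("marks.ps", "w") as f:
        f.write(PDFMARKS)

    # Ghostscript reads both PostScript files and writes a single PDF.
    subprocess.run(
        ["gs", "-dBATCH", "-dNOPAUSE", "-sDEVICE=pdfwrite",
         "-sOutputFile=document.pdf", "document.ps", "marks.ps"],
        check=True,
    )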

Oh wait, can’t CutePDF do that? Well, maybe it can. Uh, don’t bother; it’s proprietary and dangerous too.

20170511/https://en.wikipedia.org/wiki/CutePDF

  • About the whole science thing. So why the science education in schools? Oh, we are told by IEEE Spectrum that scientists are a much more political bunch than engineers. No wonder they got “science” into the school curriculum. And no wonder they lobby on the streets of Washington, D.C. while engineers don’t.

  • inven-sys On the spatial inventory system and navigation. Why constrain the user to a physical metaphor? They should be able to open up different rooms in different windows and drag and drop to move objects around, rather than having to move around themselves. Like a so-called “spatial file manager.” (See the sketch after this list.)

  • Oh my goodness. The problem with mass media. Yeah, it’s like you said, it’s all in the marketing. But it’s also like what you were thinking previously but had not yet written down. What is the reality of media appeal and of the cultural relation among people? What causes people to take some reactions as compliments and others as insults? Why do they just want to “enjoy” a story? Why don’t they want to understand the technical reality of language translation? Why do they just want to hear the story in their own language? It’s like people looking at themselves in the mirror: whatever they see that echoes their own self and psychology is “good.” “But it’s not real!” you say. Yeah, but that is the reality of entertainment. It’s not about looking at things objectively as they actually occur in the real world, whether people like them or not. It’s about seeing what they want to see; it’s the “pick and choose” culture of looking at what interests you.
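
Returning to the inventory-system bullet above, here is a hypothetical sketch of the non-physical-metaphor idea: rooms are just containers that can be opened in any number of windows, and moving an object is a plain drag-and-drop re-parenting rather than a walk through space. The names and structure are mine, purely for illustration.

    # Hypothetical data model: rooms are plain containers; "drag and drop"
    # between two open room windows is just re-parenting an item.
    from dataclasses import dataclass, field

    @dataclass
    class Item:
        name: str

    @dataclass
    class Room:
        name: str
        items: list = field(default_factory=list)

    def move_item(item, src, dst):
        """Drag and drop: remove the item from the source room and add it to
        the destination room.  No navigating through a 3D space is involved."""
        src.items.remove(item)
        dst.items.append(item)

    # Two "rooms" open side by side; dragging the flashlight just re-parents it.
    garage, closet = Room("garage"), Room("closet")
    flashlight = Item("flashlight")
    garage.items.append(flashlight)
    move_item(flashlight, garage, closet)
    print([i.name for i in closet.items])   # -> ['flashlight']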

  • 3D scanning? Where are your applications in a post-industrialized society? The medical profession, because we know that employment and economics in that sector roughly track the human population, which is known to always be growing.

  • Come on! 3D software can’t interoperate? That is terrible. So why is it not terrible in the case of human natural languages? Well, that’s because language communication happens at human speeds, and, relatively speaking, it is fast and easy for humans to learn new languages, so it is not seen as as critical a problem. But in the case of computers, everything ought to be operating at super speed. So even the minor delay of having two computer graphics programs unable to interoperate, and of bringing in an engineer to reverse engineer the two file formats so as to implement a format converter, is an unacceptable delay and expenditure.

    It makes for a user experience that rates a double F minus.

  • 3D graphics photo compression. Animations we can’t really do, but still photos we can. You’re right; that’s totally correct when you think about the whole pipeline: the lack of quality in animated motions, the computational expense, and the relatively small number of still photos. It all makes sense from a technological standpoint.

    So basically, you’re saying you can set up 3D time-lapse stills first and then add in the animation later. An audio slide show with the 3D-rendered stills also works quite well.
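
As a concrete illustration of the audio-slide-show idea, here is a rough sketch that holds each rendered still for a few seconds and lays a narration track under it. It assumes ffmpeg is installed; the file names and the five-second hold are made-up placeholders.

    # Rough sketch: build an audio slide show from 3D-rendered stills with
    # ffmpeg.  File names and timing below are placeholders.
    import subprocess

    subprocess.run(
        ["ffmpeg",
         "-framerate", "1/5",        # show each still for 5 seconds
         "-i", "render_%04d.png",    # render_0001.png, render_0002.png, ...
         "-i", "narration.mp3",      # audio track for the slide show
         "-c:v", "libx264",
         "-pix_fmt", "yuv420p",      # broadest player compatibility
         "-shortest",                # stop when the shorter stream ends
         "slideshow.mp4"],
        check=True,
    )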

So now you’re wondering about 3D printer design. How does the optical positioning encoder work in an inkjet printer? Well, let’s do a search. Surely we’ll find some useful information. Ah yes, I’m told that inkjet printers use a quadrature optical encoder, and that it is used together with an optical alignment strip and LEDs. It’s also interesting that the optical encoders are purpose-built hardware, calibrated to the specific optical positioning patterns that come with the printer. In other words, it’s not something the printer manufacturer fully understands in detail, nor is that the case for the software running in the printer. (A small sketch of quadrature decoding follows the reference links below.)

  • Also, this is important too. Commentary on why DC motors are used in modern inkjet printers rather than stepper motors. DC motors are stronger, faster, and cheaper. So that explains it! That’s why our newer HP printer is so much faster than our older Epson printer.

    Also, DC motors can generate less heat than stepper motors.

20170503/DuckDuckGo inkjet printer optical position encoder
20170503/http://www.reprap.org/wiki/Microstepping_with_optical_feedback
20170503/https://electronics.stackexchange.com/questions/208434/inkjet-printer-and-digital-encoder
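
As promised above, here is a minimal software sketch of quadrature decoding, the scheme the inkjet carriage uses with its optical strip and two photo-sensors (channels A and B, 90 degrees out of phase). Real printers do this in purpose-built hardware; this is only an illustration of the principle, with made-up sample data.

    # Minimal quadrature decoder: track position from two encoder channels.
    # The transition table maps (previous A/B state, new A/B state) to a
    # position delta of +1 or -1; anything else counts as a missed/invalid step.
    _DELTA = {
        (0b00, 0b01): +1, (0b01, 0b11): +1, (0b11, 0b10): +1, (0b10, 0b00): +1,
        (0b00, 0b10): -1, (0b10, 0b11): -1, (0b11, 0b01): -1, (0b01, 0b00): -1,
    }

    class QuadratureDecoder:
        def __init__(self):
            self.state = 0b00    # last sampled (A << 1) | B
            self.position = 0    # running count of encoder steps

        def sample(self, a, b):
            """Feed one sample of channels A and B (each 0 or 1)."""
            new_state = (a << 1) | b
            self.position += _DELTA.get((self.state, new_state), 0)
            self.state = new_state
            return self.position

    # One full Gray-code cycle in the forward direction moves the count by +4.
    dec = QuadratureDecoder()
    for a, b in [(0, 1), (1, 1), (1, 0), (0, 0)]:
        dec.sample(a, b)
    print(dec.position)   # -> 4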

This is pretty interesting: a libre, open-source UV metering system. It only meters ambient UV light levels; it does not provide you with an image. Hence it saves greatly on cost. On the other hand, it is still a fairly expensive gadget to purchase in addition to a Raspberry Pi Zero.

20170430/https://www.open-electronics.org/save-your-skin-with-this-open-source-uv-index-detector/
20170430/https://store.open-electronics.org/Module-UV-sensor-ML8511
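
A rough sketch of what reading the ML8511 from a Raspberry Pi might look like. The wiring and constants here are my assumptions, not from the article: the Pi has no analog input, so the sensor’s OUT pin is taken through an MCP3008 ADC on SPI, and the ML8511’s nominal transfer curve (roughly 1.0 V at 0 mW/cm² up to about 2.8 V at 15 mW/cm²) is treated as linear.

    # Rough sketch: read an ML8511 analog UV sensor through an MCP3008 ADC on
    # the Raspberry Pi's SPI bus.  Channel wiring and the linear voltage-to-
    # intensity fit are assumptions, not taken from the article.
    import spidev

    VREF = 3.3           # ADC reference voltage
    ADC_CHANNEL = 0      # MCP3008 channel wired to the ML8511 OUT pin

    def read_adc(spi, channel):
        """Read one 10-bit sample from an MCP3008 channel (0-7)."""
        raw = spi.xfer2([1, (8 + channel) << 4, 0])
        return ((raw[1] & 0x03) << 8) | raw[2]

    def uv_intensity(volts):
        """Map sensor output voltage to UV intensity in mW/cm^2."""
        return max(0.0, (volts - 1.0) * (15.0 / (2.8 - 1.0)))

    spi = spidev.SpiDev()
    spi.open(0, 0)                   # SPI bus 0, chip-select 0
    spi.max_speed_hz = 1350000

    counts = read_adc(spi, ADC_CHANNEL)
    volts = counts * VREF / 1023.0
    print("%.2f V  ->  %.2f mW/cm^2 UV" % (volts, uv_intensity(volts)))
    spi.close()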

New Raspberry Pi rival from Huawei? Only a rival if you think $240 is a rival price point.

20170430/https://www.open-electronics.org/the-brand-new-raspberry-pi-rival-huawei-hikey-960/
20170430/http://www.96boards.org/product/hikey960/

First of all, I’d like to read more about “well-known” techniques for reducing laser speckle. But I’m not sure that I can get to those references from the Stanford space-time research paper. No problem, I can search the Internet instead. I’ve found some very good sources; the first one in particular is the best. The second one describes using a colloidal solution to disperse a laser to reduce speckle, which is not particularly interesting for my use cases.

But the second paper does bring up an interesting point: in order to achieve good speckle-free results, 600 different speckle patterns need to be averaged together. So simply shifting a laser to two different positions falls far short of this requirement.
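
The reason two positions fall so far short: averaging N fully decorrelated speckle patterns reduces the residual speckle contrast roughly as 1/√N, so two patterns only bring the contrast down to about 71%, while 600 bring it down to about 4%. A quick back-of-the-envelope check:

    # Residual speckle contrast after averaging N independent speckle patterns,
    # using the standard 1/sqrt(N) scaling (assumes fully decorrelated patterns).
    import math

    for n in (1, 2, 600):
        print("N = %3d: residual contrast ~ %.1f%%" % (n, 100 / math.sqrt(n)))
    # N =   1: ~100.0%,  N =   2: ~70.7%,  N = 600: ~4.1%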

20170430/DuckDuckGo laser speckle reduction techniques
20170430/http://www.siliconlight.com/wp-content/themes/siliconlight/pdf/speckle-spie.pdf
20170430/https://arxiv.org/pdf/1212.5176

There is certainly no shortage of published research papers on lasers, but I think I’ve seen enough to get the idea.

  • Actually trying out my visual inventory system. It’s amazing how much detail can be contained within a single photograph. I have to profess that the system is amazing. My test: if I can get all these objects back in the bag exactly the way they came out, then the system is amazing. And guess what? The system delivered on that goal.

  • Biased image coverage: Google Street View, Google Earth. Take, for example, Google Earth. Google Earth provides ample high-resolution imagery over land and cityscapes, but the resolution of its coverage over the ocean is considerably lower. Yet oceans cover about 71% of the Earth’s surface. So there are some huge biases at work in Google Earth, but at least you can calculate them and determine that they are there in the sample data.

    Going forward, should you so desire, you could rescan those areas at the same resolution as the land areas. But for past data, we don’t have any such solution. With current technologies (as of 2017), you can’t travel back in time to rescan those areas at full resolution. Surely it would be awesome if we could travel to ancient times and scan the Earth! Unfortunately, we can’t do that. All we can do is point out that we don’t have data from that era because, at the time, technology wasn’t advanced enough to collect that kind of data. Then we can factor into our decision-making the fact that our decisions are biased by considering only the present-day data that we have at high resolution, even though the past world existed; it’s just that we are not considering the past world.

About languages and internationalization. Unicode is touted to cover only living languages, not ancient languages. Yeah, that may have seemed reasonable at the time that Unicode was first being developed, but now that it’s an international standard, what about the future? What about when present-day living languages become archaic languages? Will Unicode still provide coverage of those languages, or will it somehow deprecate the old meanings of the codes and re-assign them to make room for new languages? Well, this really hinges on the future of technology. At the time that Unicode was introduced, computers were a very primitive technology that could barely support 16-bit Unicode due to severe memory limitations. Many compromises were made, such as Han Unification, to try to squash the full spectrum of Unicode into computer systems that were actually too small to work with it. But technology has since advanced considerably, and now we are adding colored Emoji to the Unicode Standard!

So yes, if future computers are far more powerful than present-day computers, it could just as well be the case that Unicode maintains support for languages that were living in times past but are now archaic. There would be no logical reason to declare those codes invalid, since there would be more than enough address space for new language characters to expand into. The standard might say that conforming implementations are not required to render those codes, but there would be no need to reassign them.
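
For a sense of scale behind the “more than enough address space” point: Unicode’s code space is 17 planes of 65,536 code points, about 1.1 million in total, and only a fraction of them carry named characters. A quick, hypothetical check (the count depends on the Unicode version bundled with your Python build):

    # Compare Unicode's total code space against the number of named characters
    # known to this Python build's unicodedata tables.
    import unicodedata

    code_space = 0x110000     # 17 planes x 65,536 = 1,114,112 code points
    named = sum(
        1 for cp in range(code_space)
        if unicodedata.name(chr(cp), None) is not None
    )
    print("total code space: {:,}".format(code_space))
    print("named characters: {:,}".format(named))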

  • Smart phones need “front-view mirrors.” Seriously, that would greatly help prevent the problem of people with their heads down looking at their phones, walking forward at the same time, and almost running into you. The human visual system simply isn’t designed to see above the head; that’s very much a blind area, since our field of view extends far less upward than it does downward or to the sides.

    • So why do people like to hold their mobile phones down in front of them? Why not hold them up ahead where they can see where they are walking? Well, we know the answer to that: “Remember the gorilla arm.” It is far more comfortable to hold your arms down low than it is to hold them up high. So when you see people walking around with their arms down and their heads down over their smartphones, remember they are doing so because it feels more ergonomic.

    That being said, if someone wants good ergonomics and safety at the same time while walking and using a mobile phone, they need to use “front-view mirrors.” I personally wouldn’t mind being approached by someone using a smart phone with front-view mirrors, because I would be able to see their face through the mirrors just as they would be able to see mine, and it would be very clear from their navigational decisions whether or not they are actually trying to avoid walking into me.

  • UPDATE 2019-11-21: Indeed this has become a thing with some smartphones.