Upgrading deprecated software, or the tale of Flamingo (Part 1)

TL;DR: Code is available on GitHub

Flamingo

 

Flamingo is an XMPP client released in 2012 that differentiated itself with a high-quality design. The competition consisted almost solely of Adium, a barely maintained, very old project (probably dating back to the early OS X days) based on libpurple. From a user-experience perspective, Flamingo was light years ahead, but two problems prevented the migration from being a no-brainer.

Missing features

While it supported a fair number of modern platforms at the time of its release, it was missing a very important feature: OTR integration.

OTR

According to Wikipedia, « Off-the-Record Messaging (OTR) is a cryptographic protocol that provides encryption for instant messaging conversations ». The protocol comes with a couple of compromises (no offline message delivery, only one client online at any point in time…), but once those are factored in, there has been no successful cryptographic or implementation attack against it in recent memory. The most common library implementing the protocol, libOTR, is well regarded, and despite some aging design choices making it a little cumbersome to use from a developer standpoint, it is the best way I’m aware of to protect an IM conversation.

Adium comes with a reference plugin, but Flamingo’s developers didn’t provide a proper integration with their homemade XMPP library.

Deprecated

In mid-2015, Flamingo’s developers announced they were not going to continue working on the project and offered to sell it to someone else. One year later, the project still appears to be in limbo, and various emails offering to make the project open source did not receive any response. And thus, this article came to be.

Project

My goal when starting this project was to inject into Flamingo seamlessly (from a user standpoint, it must be totally invisible) and add a small button to its UI controlling a libOTR-based wrapper which would, thanks to a couple of hooks, enable Flamingo to use OTR. This post will probably be published in several parts, one for each milestone, and I’ll do my best to explain the approach taken and why it worked, but also what didn’t.

Injection

 

Injection vehicle

My first idea for injecting code into Flamingo was to write a small framework that would wake up early in the loading process and hook as early as possible. However, adding a framework wouldn’t be enough, as it wouldn’t be linked to the binary, and tweaking the main binary’s flags to make it load our framework would break the code signature. Stripping the code signature was an option, but code signing is there for good reasons, and while it was a possibility, I decided to explore other alternatives before manipulating anything in the realm of the code signature.

The dynamic-library route was the most stable vehicle, but if a framework wouldn’t cut it, what about a simple dylib? This would work, but I’d still have to find a way to perform the actual injection.

Test library

In order to test whether I could actually achieve code execution, I wrote a very simple library that writes to the log whenever it is loaded or unloaded. To achieve that, I used two very simple attributes: __attribute__((constructor)) and __attribute__((destructor)). They let you specify which functions act as the library’s constructor and destructor. Those are called very early, significantly before main(). Thankfully, probably because of the load-order priorities, all the other frameworks are already loaded by then, and thus injection is possible at this stage.

A small code sample is included below.
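Here is a minimal sketch of what such a test library can look like (the exact log calls and function names are illustrative, not necessarily the ones used in the real library):

    // test.c — build with: clang -dynamiclib test.c -o test.dylib
    #include <stdio.h>
    #include <syslog.h>

    // Called by dyld when the library is loaded, well before main() runs.
    __attribute__((constructor))
    static void test_loaded(void)
    {
        syslog(LOG_NOTICE, "test.dylib loaded");
        printf("Hello world\n");
    }

    // Called when the library is unloaded or the process exits cleanly.
    __attribute__((destructor))
    static void test_unloaded(void)
    {
        syslog(LOG_NOTICE, "test.dylib unloaded");
    }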

Once the library was compiled (with clang through Xcode, in my case), it was time to inject it.

Injection

One of the reasons that made me lean toward the dylib was that I was aware there were ways to tell dyld (dynamic ld, OS X/macOS’ dynamic linker) to add a given library to every binary it loads. After some research, I figured this could be done using the dreaded DYLD_* environment variables.

Dreaded

Those variables are useful but also extremely dangerous, as they enable a total takeover of any program spawned from the environment in which the variable is set: that’s an incredibly powerful persistence mechanism, which was used a couple of times to publicly compromise iOS’ and macOS’ security. Because they are so powerful, I didn’t want to set them system-wide (it would also raise eyebrows in the security/privacy community, which would be the prime users of this plugin).

However, it was the only way I could think of that didn’t involve a kernel extension or another active monitoring solution, both of which would be just as hard to swallow, with the added inconvenience of, respectively, increased attack surface in the kernel and loss of responsiveness/battery life. Moreover, those two solutions would have made the installation process even more cumbersome and were thus quickly discarded.

DYLD_INSERT_LIBRARIES

Of the whole DYLD_ family, DYLD_INSERT_LIBRARIES is the most interesting environment variable, as it tells dyld to inject the libraries it lists into any app spawned. It’s limited by the scope in which you can set the variable, but it’s still incredibly powerful.

At first, as a proof of concept, I tried to set DYLD_INSERT_LIBRARIES="/usr/local/***/test.dylib" system-wide.

System-wide injection

First, I saw some old mentions on mailing lists claiming that setting variables in ~/.MacOSX/environment.plist would be the way to do it, but this path was already deprecated as of 10.10.5 (I have no idea in which version that happened).

I realized that the modern way is to use launchctl setenv (as a side note, achieving persistence through launchctl relies on adding a LaunchAgent for the user, which would perform the various calls). However, I was surprised to discover that setting DYLD_INSERT_LIBRARIES with launchctl setenv wouldn’t actually set the variable: getenv returned an empty string. Thankfully, this part of macOS is open source, so I could dive in and try to understand what was going on.

I first looked directly at dyld’s source code (not realizing the variable wasn’t actually set), and while dyld does have some restrictions on when it accepts the injection (SUID binaries, among others), it wasn’t the culprit. This is where launchd comes in. For security reasons, envitem_new (line 5926) will discard any variable matching the ‘DYLD_’ prefix if ‘launchd_allow_global_dyld_envvars’ isn’t set.

This variable is set to false by default, unless:

  • The file /private/var/db/.launchd_allow_global_dyld_envvars exists (the directory isn’t protected by SIP, so that part is easy), or
  • pid1_magic is false, which happens to mean that launchd’s PID must not be 1. Sadly, launchd IS PID 1 and thus there is no reliable, persistent way to get launchd to accept our environment variable. Or is there?

Bypassing launchctl

launchd is not bypassable, as it’s the foundation of the userland launching mechanism, and killing it (so it respawns with a non-1 PID) is not realistic in a production environment. However, launchctl isn’t the only way to pass an environment variable to a binary when launching it.

After some despair, I ran into a helpful Stack Overflow topic which mentioned that you could pass an environment variable to a program when launching it from bash by using the following syntax: PATH="XXXXXX" ./binary. Interesting, could it work for us? Spoiler alert: it does.

The script

The problem now is to figure out how to hijack the launch process to insert a small script. The first thing I tried was to add a shell script in Flamingo.app/Contents/MacOS and redirect the launch flow by tweaking Info.plist, but sadly, it broke the code signature.

Then, I decided to create a separate application bundle, with the same icon and apparently identical to the original app, but embedding the real .app in its Resources directory, along with the library. This started well, but for some reason, still on OS X 10.10.5, the system couldn’t identify the shell script as an ASCII script and, not recognizing a valid x86 Mach-O file, probably returned EBADARCH, resulting in the UI complaining about a PowerPC binary.

After that, I moved to using system() in a very small C program, which had the added bonus of letting me ask Xcode to sign the injection bundle, preventing tampering with the application performing the tampering (tamperception?). The third try was the right one and ‘Hello world’ appeared in the system console. We had injection! With the added bonus of a seamless build procedure. The only issue is that, because the injected app is launched separately from the injector, the debugger isn’t attached to the application and has to be attached manually every time I want to break into it, but hey, that’s already great progress!
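For reference, here is a minimal sketch of what such a launcher can look like (the bundle layout, the dylib name and the paths are assumptions made for illustration; the real injector differs in the details):

    // main.c — tiny launcher placed in the outer bundle's MacOS directory.
    // It scopes DYLD_INSERT_LIBRARIES to the single process it spawns,
    // then starts the real Flamingo assumed to be embedded in Resources.
    #include <stdio.h>
    #include <stdlib.h>
    #include <libgen.h>

    int main(int argc, char *argv[])
    {
        (void)argc;
        // Directory containing this launcher (…/Contents/MacOS).
        char *dir = dirname(argv[0]);

        char cmd[2048];
        // Assumed layout: the dylib and the real .app live in ../Resources.
        snprintf(cmd, sizeof(cmd),
                 "DYLD_INSERT_LIBRARIES=\"%s/../Resources/test.dylib\" "
                 "\"%s/../Resources/Flamingo.app/Contents/MacOS/Flamingo\" &",
                 dir, dir);

        // system() goes through /bin/sh, so the VAR=value prefix applies
        // only to this one launch, never system-wide.
        return system(cmd);
    }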

Next time, I’ll write about how to start messing with Flamingo once injected into the process, without breaking everything.

I’m not sure I’ll bother writing up the hooking and patching part, as it’s fairly specific and straightforward. If you read this far and care about it, drop me a message!

The future of computing

The release of the new iPad Pro, coupled with Apple’s communication trying to sell the iPad to old PC users, sparked some new debates on whether this platform will take over the world given the proper tools (the most realistic ones being Xcode for iPad and the ability to sideload apps, on which I don’t agree, but whatever). I initially wanted to unleash my (hotter) take on Twitter but instead decided to take some time to put it down in clearer words.

Do I think that PCs in general are going to get hurt by tablets? Yes. They already were by smartphones. Do I think that tablets will take over the roles the PC has kept despite smartphones? To some extent.

However, I don’t believe that PCs are going to die, simply because mainframes and workstations still exist today, despite smaller and cheaper computing platforms having destroyed the hegemony they enjoyed in their time. Just like in those two cases, the large majority of users will move to a newer, cheaper, simpler solution because they don’t need the unique advantages of the platform.

Moore’s law is amazing and, despite whatever the press may say, is not going away anytime soon. However, a 30W chip is still a 30W chip: no matter what you can fit in a 5W component, it will not beat a chip with 6× the thermal headroom (and I’m not even talking about the 90W-140W chips you can find in high-end desktop computers). If you need raw power to compute, compile, render…, the cloud isn’t going to cut it either (hi mainframes, long time no see), because its recurring costs make it a deal breaker for any young enthusiast to use for work purposes (tablets are not a great replacement either, because of battery and computing-power constraints).

The platform’s constraints also make it hard to envision a future where a game as long and complex as Star Citizen, Elite Dangerous or even Mass Effect could see the light of day there (I have yet to see a commercial success sold at 60€ on the App Store the way it happens on gaming consoles. And don’t even get me started on the size of the market: we know that iPad users don’t like to pay significantly more than iPhone users, and the only games making a buck use IAP, because nothing else really works on iOS). Pro tools are also a long stretch, one only getting longer with time.

In my opinion, the worst-case scenario for the PC is that tablets get their act together and cannibalize the light laptop market (MacBook, MacBook Air, maybe the 13″ MacBook Pro) and a significant portion of the desktop market (though external display support and alternative input methods will need work), but won’t be able to take down the heavy-duty, 1000+€ range of the PC and laptop market.

The question isn’t « can a tablet do what a PC can? » but « can a tablet do it better than a PC? », and so far, more often than not, the answer is « not really ».

Again, I want to state this is only my opinion.

Tracking a user in the subway without a GPS signal

The subway is a strange place: very common in most large cities, faster and more reliable than buses, and yet, by its very nature, a challenge for navigation apps. Being underground most of the time, there is no GPS or WiFi signal, LTE/3G is sporadic at best (outside a couple of cities, it is far from standard), and falling back to a 2G signal would massively increase the battery strain. Moreover, I’m unsure the reception would be good enough to compute a sufficiently precise location. The built-in location service is thus out of the question.

I have no time to work on a directions app, so I’ll just drop the concept that came to my mind here, free of any license (it probably qualifies as ‘prior art’ and thus could be used to challenge a potential future patent), in the hope that someone picks it up and implements it.


Music, or keeping the train of thought intact despite interruptions

While working on a system this afternoon, especially focused, I received a couple of IMs, and reading them was enough to make me lose track of what I was trying to code.

However, the time it took for the song in my headphones to loop back around (a short, very catchy song that future reversers of Rakshata may stumble upon) was enough not only to bring me back to my train of thought about the system architecture, but also to bring my typing speed back to what it was before the notification (the most significant consequence of the distraction was a brutal drop in my speed).

It happened a couple of times during the session, and I’m wondering whether this was simply a coincidence or whether it also happens to you.

 

Care to comment?

Twitter vs RSS

Twitter and RSS can seem to serve pretty much the same purpose: keeping up with someone’s (or something’s) activity (I won’t talk about Facebook here because I don’t use it), and since major websites are on Twitter, there is virtually nothing you can’t follow through Twitter. The opposite is not true, but for a 15-year-old standard, this is not a big surprise.

However, I still check my RSS client several times a day for news and various other sources of data, and I often see people not understanding why I keep two separate sources: all these websites are available on Twitter, so why bother with more? I’ll try to answer that here.

I. I read my whole timeline

I don’t follow many people, but I really consider every single one and read everything, which may sometimes take a lot of time. I usually get between 100 and 150 tweets in the morning and, as I don’t have much time, I have to keep the time per tweet as low as possible. Usually, links end up in Pocket and I catch up on them later. However, I can’t throw daily news into Pocket, because it will either become obsolete quickly or just make it impossible to catch up on other interesting stuff.

II. Time management

I don’t check Feedly the way I check Twitter: I know I spend ~10 seconds per tweet, and Pocket can handle external resources. However, I know that checking ~50 items on Feedly can easily take an hour (well over a minute per item, versus ~10 seconds per tweet), so I only have a look at it when I know I can read the whole thing and have a lot of time available. Merging the two would prevent me from catching up at all.

I had to write something, so here it is. I’ll probably add things to this post as I think of them, but this is the core idea: time management.

I was trying to write a much bigger article but I lost inspiration, so this is all we have :/

Seriously, I need to write more

Well, I know this blog has been quite dead lately… This is due to three things:

– Lack of time: I’m setting up some pretty awesome things

– Lack of topics: even if I deal with a bunch of things, most of them are related to cryptography and I don’t want to disclose too many details about Rakshata’s cryptosystem. I’m sadly not good enough to create a system that wouldn’t have to rely at least a little bit on obscurity to stay ‘secure’

– Self-censorship: I became a big brony a few months ago (I guess anyone following my posts on Rakshata’s website figured it out), so I try not to flood this blog with everything related to those awesome creatures, but that contributes to the lack of topics to talk about.

 

I’m still pretty active on Twitter (here) and I’ll try to figure out interesting topics to which I can actually add something, but I sadly can’t promise anything :-/

 

See you in ~ 6 months 😉

 

Taiki

Chromium Update

Edit 26/02/2013: Updated to launch the binary itself

Edit 16/06/2016: Moved the code to GitHub

I used to be a proud user of Firefox but, update after update, the slowness finally pushed me out of its user base.
After that, I started to look for something else, and there is only one big competitor to Firefox: Chrome… But there is an issue.

I’m not a big fan of Google.

I use some Google services like Gmail (even if, out of my 4 email addresses, I host 3 on my own server), YouTube, Reader (not for long, I’m looking for something else) and Search, but I stopped using Drive and the other services.

The main reason is that I’m against their business model: they make money with what they know about you.
I have an iPhone and 3 computers running W7 (maybe a W8 upgrade someday) and Fedora, but nothing from Google.
Why pay for a phone twice as expensive as an Android device? Google. I read an interesting blog post some time ago: « The main difference between Google, Amazon, Apple and Microsoft is that Apple and Microsoft want to sell devices and/or software. Amazon wants you to buy things on their website, and Google wants to show you ads. ». When you buy an Apple/MS product, they are happy; when you buy a Kindle/Android device, it’s just the beginning.

But I still want a good new browser!

Chromium

Chromium is the open-source project behind Chrome: Google takes the Chromium source code, adds a few proprietary APIs and tracking bits, then changes the logo and releases it.

Great! I had found my new favorite browser… but everything seemed too good, a little bit like when you finish a math test in half the allotted time: there must be a catch somewhere…

No auto-update 🙁

Chrome can update itself, silently and quite fast, but that seems to be part of the Google-only code, because Chromium can’t. And if you’re, like me, quite paranoid about security, that’s a huge issue. Does that mean I’ll have to find yet another browser, when I was so close to my goal?

Nah, I’m not a developer for nothing: I can build something to update Chromium.

I spent a few hours figuring out how everything works and made a small updater. It’s not perfect: it only works on Windows, needs to be compiled for every computer and can’t start the update by itself, but it works.
Details are in the README file
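To give an idea of the approach, here is a rough sketch of the core check, not the actual updater: the URL below is the public Chromium continuous-build snapshot location, and installed_revision.txt is a hypothetical local marker used only for illustration.

    /* updater_sketch.c — compare the installed Chromium snapshot revision
     * with the latest one published by the continuous builders (needs libcurl). */
    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>
    #include <curl/curl.h>

    /* Append the (tiny) HTTP response into a small fixed buffer. */
    static size_t on_data(char *ptr, size_t size, size_t nmemb, void *userdata)
    {
        char *buf = (char *)userdata;
        size_t len = size * nmemb;
        size_t used = strlen(buf);
        if (len > 63 - used)
            len = 63 - used;
        memcpy(buf + used, ptr, len);
        buf[used + len] = '\0';
        return size * nmemb;
    }

    int main(void)
    {
        char latest[64] = "";
        CURL *curl = curl_easy_init();
        if (!curl)
            return 1;

        /* The continuous builds publish their latest revision in LAST_CHANGE. */
        curl_easy_setopt(curl, CURLOPT_URL,
            "https://commondatastorage.googleapis.com/chromium-browser-snapshots/Win/LAST_CHANGE");
        curl_easy_setopt(curl, CURLOPT_WRITEFUNCTION, on_data);
        curl_easy_setopt(curl, CURLOPT_WRITEDATA, latest);
        if (curl_easy_perform(curl) != CURLE_OK) {
            curl_easy_cleanup(curl);
            return 1;
        }
        curl_easy_cleanup(curl);

        /* Hypothetical marker written after each successful update. */
        char installed[64] = "0";
        FILE *f = fopen("installed_revision.txt", "r");
        if (f) {
            fgets(installed, sizeof(installed), f);
            fclose(f);
        }

        if (atol(latest) > atol(installed))
            printf("New Chromium build available: r%s (installed: r%s)\n", latest, installed);
        else
            printf("Chromium is up to date (r%s)\n", installed);
        return 0;
    }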

Grab the file here

It’s not on GitHub because I’m a git noob, but you can submit your patches 🙂

Enjoy and tell me what you think about it

Taiki ~