<?xml version="1.0" encoding="UTF-8"?><rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" version="2.0" xmlns:media="http://search.yahoo.com/mrss/"><channel><title><![CDATA[The (not so) Weekly Fitz]]></title><description><![CDATA[Thoughts, stories and ideas.]]></description><link>https://weekly.elfitz.com/</link><image><url>https://weekly.elfitz.com/favicon.png</url><title>The (not so) Weekly Fitz</title><link>https://weekly.elfitz.com/</link></image><generator>Ghost 5.79</generator><lastBuildDate>Tue, 20 Feb 2024 10:14:33 GMT</lastBuildDate><atom:link href="https://weekly.elfitz.com/rss/" rel="self" type="application/rss+xml"/><ttl>60</ttl><item><title><![CDATA[Types are nice, infinite recursion edition]]></title><description><![CDATA[Playing around on hackerrank to prepare for an interview, I had some fun solving tree problems using recursion.]]></description><link>https://weekly.elfitz.com/2023/08/25/types-are-nice-infinite-recursion-edition/</link><guid isPermaLink="false">64e8b78050fe200001751037</guid><category><![CDATA[Javascript]]></category><category><![CDATA[Programming]]></category><dc:creator><![CDATA[ElFitz]]></dc:creator><pubDate>Fri, 25 Aug 2023 14:19:05 GMT</pubDate><media:content url="https://weekly.elfitz.com/content/images/2023/08/6A2C087A-BBF5-4A75-8883-A4D746E770B2.png" medium="image"/><content:encoded><![CDATA[<pre><code class="language-shell">RangeError: Maximum call stack size exceeded</code></pre><img src="https://weekly.elfitz.com/content/images/2023/08/6A2C087A-BBF5-4A75-8883-A4D746E770B2.png" alt="Types are nice, infinite recursion edition"><p>Can you spot what&apos;s wrong here?</p><pre><code class="language-javascript">/*
    Node is defined as
    var Node = function(data) {
        this.data = data;
        this.left = null;
        this.right = null;
    }
*/

function treeHeight(root) {
    if (!root) { return 0 }
    return 1 + Math.max(treeHeight(root.left), treeHeight(root).right)
}
</code></pre><p>Absolutely, yes, line 12:</p><pre><code class="language-javascript">return 1 + Math.max(treeHeight(root.left), treeHeight(root).right)</code></pre><p>A misplaced parenthesis, meaning this recursive function can never reach its exit condition. Instead of calling itself with both <code>root.left</code> and <code>root.right</code>, it will call itself with <code>root.left</code> and <code>root</code>, the unchanged input parameter. Until it blows the stack.</p><p>And since <code>treeHeight</code> returns a <code>number</code>, and a <code>number</code> has no <code>right</code> property, proper typing would easily have caught it.</p><p>&#xAF;\_(&#x30C4;)_/&#xAF;</p><figure class="kg-card kg-code-card"><pre><code class="language-javascript">function treeHeight(root) {
    if (!root) { return 0 }
    return 1 + Math.max(treeHeight(root.left), treeHeight(root.right))
}</code></pre><figcaption>Without typo</figcaption></figure>]]></content:encoded></item><item><title><![CDATA[Going further with ChatGPT]]></title><description><![CDATA[Discover the untapped potential of ChatGPT in our latest blog post, and learn how this tool can transform the way you create diagrams and rephrase text, simplifying tedious tasks and enhancing your content!]]></description><link>https://weekly.elfitz.com/2023/04/21/going-further-with-chatgpt/</link><guid isPermaLink="false">643c3988542ce9003d3100ca</guid><category><![CDATA[ChatGPT]]></category><category><![CDATA[GPT]]></category><category><![CDATA[AI]]></category><category><![CDATA[OpenAI]]></category><dc:creator><![CDATA[ElFitz]]></dc:creator><pubDate>Fri, 21 Apr 2023 12:30:50 GMT</pubDate><media:content url="https://weekly.elfitz.com/content/images/2023/04/going-further-with-chat-gpt.png" medium="image"/><content:encoded><![CDATA[<img src="https://weekly.elfitz.com/content/images/2023/04/going-further-with-chat-gpt.png" alt="Going further with ChatGPT"><p>Many people seem to try and ask ChatGPT for answers based on what it <em>knows</em>. And quite a few end up frustrated by its tendency to confidently and credibly hallucinate facts, answers, and sometimes whole scientific papers.</p><p>Except that ChatGPT isn&apos;t an encyclopedia. It can give some interesting answers, for sure, but I&apos;m way more interested and confident in what it can <em>do</em> than what it <em>knows</em>.</p><h2 id="diagrams">Diagrams</h2><p>As you may have noticed, <a href="https://weekly.elfitz.com/2023/04/16/conditionally-debounce-value-updates/">my previous post</a> contained, probably for the first time, a diagram. 
Now, these things are quite useful and helpful when it comes to explaining, clarifying, and understanding concepts.</p><p>Diagrams, whether sequence diagrams, decision trees or UML, are a nice way of representing how a system works, and can help understand complex systems and time-sensitive interactions.</p><p>But, <a href="https://www.reddit.com/r/computerscience/comments/12phywc/what_are_some_practical_benefits_of_uml/jgmhuom/?ref=weekly.elfitz.com">as explained by a kind redditor</a>, when used for documentation these diagrams need to be kept up to date, which eventually gets delayed, or dropped altogether.</p><p>Personally, I had given up. I had always found them tedious to make, even when using <a href="https://mermaid.js.org/?ref=weekly.elfitz.com">Mermaid</a> and <a href="https://mermaid.live/?ref=weekly.elfitz.com">its live editor</a>. So nearly all of my previous blog posts did without, and so did most of the documentation I&apos;ve ever written.</p><p>Well, no more.</p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://weekly.elfitz.com/content/images/2023/04/debounce-decision-tree-1.png" class="kg-image" alt="Going further with ChatGPT" loading="lazy" width="991" height="444" srcset="https://weekly.elfitz.com/content/images/size/w600/2023/04/debounce-decision-tree-1.png 600w, https://weekly.elfitz.com/content/images/2023/04/debounce-decision-tree-1.png 991w" sizes="(min-width: 720px) 720px"><figcaption>The diagram in question</figcaption></figure><p>What changed? I can now get these diagrams without having to make them. 
With a single line and some copy-pasting.</p><blockquote>Represent the following process&apos; decision tree using mermaid:<br>- On a new frame, try to detect the features<br>- If the features are found, compare with the previous stored result<br>- If the new results are different from the old ones, update the stored result with the new values, enable the camera shutter button (and, on press, use these results)<br>- If the features aren&apos;t found, wait 0.3 seconds before wiping the stored result and disabling the camera shutter button</blockquote><p>Yielding the following:</p><!--kg-card-begin: markdown--><p><em>Here&apos;s the decision tree represented using Mermaid syntax:</em></p>
<pre><code class="language-css">graph TD
A[Detect Features on New Frame] --&gt;|Features Found| B[Compare with Previous Stored Result]
B --&gt;|Results are Different| C[Update Stored Result and Enable Camera Shutter Button]
B --&gt;|Results are the Same| D[Do Nothing]
A --&gt;|Features Not Found| E[Wait 0.3 seconds]
E --&gt; F[Wipe Stored Result and Disable Camera Shutter Button]
</code></pre>
<p><em>You can copy and paste the above code into the Mermaid Live Editor (<a href="https://mermaid-js.github.io/mermaid-live-editor/?ref=weekly.elfitz.com">https://mermaid-js.github.io/mermaid-live-editor/</a>) to see a visual representation of the decision tree.</em></p>
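And if you end up generating a lot of these, the prompt itself is trivial to template. A minimal TypeScript sketch (the helper is hypothetical, just to illustrate the idea; nothing here calls any API):

```typescript
// Hypothetical helper (my own, not any library's API): build the Mermaid-generation
// prompt from a list of process steps, so each new diagram only needs the bullet points.
function buildMermaidPrompt(steps: string[]): string {
  const bullets = steps.map((step) => `- ${step}`).join("\n");
  return `Represent the following process' decision tree using mermaid:\n${bullets}`;
}

console.log(buildMermaidPrompt([
  "On a new frame, try to detect the features",
  "If the features are found, compare with the previous stored result",
]));
```

The output is exactly the kind of prompt shown above, ready to paste into ChatGPT.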
<!--kg-card-end: markdown--><p>Paste it in <a href="https://mermaid.live/?ref=weekly.elfitz.com">mermaid.live</a>, make a screenshot, and you&apos;re done!</p><p>Now, it isn&apos;t perfect (here the code block&apos;s language is said to be css, when it obviously isn&apos;t), and syntax errors sometimes find their way into the output (such as parentheses in sequence diagrams). But it certainly beats doing it all by hand.</p><h2 id="rephrasing">Rephrasing</h2><p>Bullet points are a simple way of writing things down, especially processes, but there&apos;s a reason Amazon favors narrative structures for its memos over PowerPoints and lists.</p><p>So let&apos;s see how we can rephrase our process above.</p><h3 id="first-try">First try</h3><h4 id="prompt">Prompt</h4><blockquote>Describe this process in a narrative format:<br>On a new frame, try to detect the features<br>If the features are found, compare with the previous stored result<br>If the new results are different from the old ones, update the stored result with the new values, enable the camera shutter button (and, on press, use these results)<br>If the features aren&apos;t found, wait 0.3 seconds before wiping the stored result and disabling the camera shutter button</blockquote><h4 id="output">Output</h4><blockquote>Once upon a time, in the land of advanced technology, there was a camera application that could detect and analyze specific features within an image. This magical camera was designed to enhance the photography experience by making sure the captured image had the desired features.<br><br>One sunny day, as a user opened the camera application, the mystical process began. As the camera peered into the world through its lens, it diligently searched for the features it was programmed to recognize in each new frame it saw. 
It was like a curious explorer, tirelessly scanning its surroundings for treasures.<br><br>Whenever the camera successfully discovered the desired features within a frame, it sprang into action. It quickly compared the newly detected features with those from the previous frame it had stored in its memory. This step was crucial, as the camera wanted to make sure it had the most accurate and up-to-date information.<br><br>If the newly detected features were different from the ones stored in its memory, the camera updated its stored results with the fresh, new values. It then enabled the shutter button, as if proudly announcing to the user, &quot;I have found what you seek! You may now capture the perfect image!&quot; And upon pressing the shutter button, the camera would use these updated results to snap a picture.<br><br>However, there were times when the camera, despite its best efforts, could not find the desired features in a frame. In such moments, it patiently waited for 0.3 seconds before wiping the stored result from its memory and disabling the camera shutter button. This was the camera&apos;s way of saying, &quot;Hold on, I&apos;m still searching for the perfect shot. Please give me a moment to explore.&quot;<br><br>Thus, the camera application continued its quest to help users capture the best possible images by constantly searching, comparing, and updating its knowledge of the world. 
And in the end, the camera and its user lived happily ever after, snapping remarkable photos filled with wonderful features.</blockquote><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://weekly.elfitz.com/content/images/2023/04/a-creative-ai.png" class="kg-image" alt="Going further with ChatGPT" loading="lazy" width="1456" height="816" srcset="https://weekly.elfitz.com/content/images/size/w600/2023/04/a-creative-ai.png 600w, https://weekly.elfitz.com/content/images/size/w1000/2023/04/a-creative-ai.png 1000w, https://weekly.elfitz.com/content/images/2023/04/a-creative-ai.png 1456w" sizes="(min-width: 720px) 720px"><figcaption>GPT-4 can be quite creative sometimes.</figcaption></figure><p>While funny and quite creative, I don&apos;t know about you, but this isn&apos;t <em>exactly</em> what I was hoping for. So let&apos;s try again!</p><h3 id="second-try">Second try</h3><h4 id="prompt-1">Prompt</h4><blockquote>Describe the following process without using bullet points:<br>- On a new frame, try to detect the features<br>- If the features are found, compare with the previous stored result<br>- If the new results are different from the old ones, update the stored result with the new values, enable the camera shutter button (and, on press, use these results)<br>- If the features aren&apos;t found, wait 0.3 seconds before wiping the stored result and disabling the camera shutter button</blockquote><h4 id="output-1">Output</h4><blockquote>In this process, the first step is to detect the features on a new frame. When these features are successfully identified, they are compared with the previously stored result. If any differences are detected between the new and old results, the stored result is updated with the new values, and the camera shutter button is enabled. Consequently, pressing the shutter button will use these updated results. 
However, if the features cannot be found, the system waits for 0.3 seconds before clearing the stored result and disabling the camera shutter button.</blockquote><p>Much better!</p><h2 id="structuring-and-formatting-text">Structuring and formatting text</h2><p>Speaking of formatting text: let&apos;s say you&apos;ve transcribed some audio or some video using <a href="https://openai.com/research/whisper?ref=weekly.elfitz.com">Whisper</a>. Great. But honestly, it&apos;s probably a mess. And formatting it is, again, both tedious and boring.</p><p>Fret not.</p><h3 id="our-prompt">Our prompt:</h3><p>For the sake of brevity, I didn&apos;t include the whole speech, but it performed admirably well. It will perform just as well in quite a few other languages. Just remember to break up large texts if it behaves strangely.</p><blockquote>Structure the following text using Markdown:<br><br>&#x201C;I am honored to be with you today at your commencement from one of the finest universities in the world. I never graduated from college. Truth be told, this is the closest I&#x2019;ve ever gotten to a college graduation. Today I want to tell you three stories from my life. That&#x2019;s it. No big deal. Just three stories.The first story is about connecting the dots. I dropped out of Reed College after the first 6 months, but then stayed around as a drop-in for another 18 months or so before I really quit. So why did I drop out? It started before I was born. My biological mother was a young, unwed college graduate student, and she decided to put me up for adoption. She felt very strongly that I should be adopted by college graduates, so everything was all set for me to be adopted at birth by a lawyer and his wife. Except that when I popped out they decided at the last minute that they really wanted a girl. 
So my parents, who were on a waiting list, got a call in the middle of the night asking: &#x201C;We have an unexpected baby boy; do you want him?&#x201D; They said: &#x201C;Of course.&#x201D; My biological mother later found out that my mother had never graduated from college and that my father had never graduated from high school. She refused to sign the final adoption papers. She only relented a few months later when my parents promised that I would someday go to college. [...]&#x201D;</blockquote><h3 id="the-result">The result</h3><p>And here&apos;s our output!</p><!--kg-card-begin: markdown--><h1 id="commencement-speech">Commencement Speech</h1>
<p>I am honored to be with you today at your commencement from one of the finest universities in the world. I never graduated from college. Truth be told, this is the closest I&#x2019;ve ever gotten to a college graduation. Today I want to tell you three stories from my life. That&#x2019;s it. No big deal. Just three stories.</p>
<h2 id="first-story-connecting-the-dots">First Story: Connecting the Dots</h2>
<p>The first story is about connecting the dots.</p>
<p>I dropped out of Reed College after the first 6 months, but then stayed around as a drop-in for another 18 months or so before I really quit. So why did I drop out? It started before I was born. My biological mother was a young, unwed college graduate student, and she decided to put me up for adoption. She felt very strongly that I should be adopted by college graduates, so everything was all set for me to be adopted at birth by a lawyer and his wife. Except that when I popped out they decided at the last minute that they really wanted a girl.</p>
<p>So my parents, who were on a waiting list, got a call in the middle of the night asking: &#x201C;We have an unexpected baby boy; do you want him?&#x201D; They said: &#x201C;Of course.&#x201D; My biological mother later found out that my mother had never graduated from college and that my father had never graduated from high school. She refused to sign the final adoption papers. She only relented a few months later when my parents promised that I would someday go to college. [...]</p>
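And if your transcript is too long to paste in one go, the &quot;break up large texts&quot; advice above is easy to automate. A naive TypeScript sketch (the helper and its paragraph-boundary strategy are my own choices, nothing canonical):

```typescript
// Hypothetical helper (my own, not from any library): split a long text into chunks
// of at most maxLen characters, preferring to break on blank lines so each chunk
// stays coherent when pasted into ChatGPT on its own.
function chunkText(text: string, maxLen: number): string[] {
  const paragraphs = text.split(/\n\s*\n/);
  const chunks: string[] = [];
  let current = "";
  for (const paragraph of paragraphs) {
    const candidate = current ? `${current}\n\n${paragraph}` : paragraph;
    if (candidate.length <= maxLen) {
      current = candidate;
    } else {
      if (current) chunks.push(current);
      // A single oversized paragraph simply becomes its own chunk.
      current = paragraph;
    }
  }
  if (current) chunks.push(current);
  return chunks;
}

const demo = chunkText("First paragraph.\n\nSecond paragraph.\n\nThird one.", 40);
console.log(demo);
```

Each chunk then gets the same &quot;Structure the following text using Markdown:&quot; instruction prepended before being sent off.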
<!--kg-card-end: markdown--><h2 id="conclusion">Conclusion</h2><p>These are just two examples, and there are obviously many more. I&apos;ll put together something more comprehensive later on, but I hope this has at least given you some new ideas on how you could use <a href="https://openai.com/blog/chatgpt?ref=weekly.elfitz.com">ChatGPT</a> to simplify boring workflows and bring some improvements to what matters to you.</p><p>Enjoy your day, and have fun!</p><figure class="kg-card kg-gallery-card kg-width-wide kg-card-hascaption"><div class="kg-gallery-container"><div class="kg-gallery-row"><div class="kg-gallery-image"><img src="https://weekly.elfitz.com/content/images/2023/04/midjourney-fun-1.png" width="2000" height="2000" loading="lazy" alt="Going further with ChatGPT" srcset="https://weekly.elfitz.com/content/images/size/w600/2023/04/midjourney-fun-1.png 600w, https://weekly.elfitz.com/content/images/size/w1000/2023/04/midjourney-fun-1.png 1000w, https://weekly.elfitz.com/content/images/size/w1600/2023/04/midjourney-fun-1.png 1600w, https://weekly.elfitz.com/content/images/2023/04/midjourney-fun-1.png 2048w" sizes="(min-width: 720px) 720px"></div><div class="kg-gallery-image"><img src="https://weekly.elfitz.com/content/images/2023/04/midjourney-fun-2.png" width="2000" height="1121" loading="lazy" alt="Going further with ChatGPT" srcset="https://weekly.elfitz.com/content/images/size/w600/2023/04/midjourney-fun-2.png 600w, https://weekly.elfitz.com/content/images/size/w1000/2023/04/midjourney-fun-2.png 1000w, https://weekly.elfitz.com/content/images/size/w1600/2023/04/midjourney-fun-2.png 1600w, https://weekly.elfitz.com/content/images/size/w2400/2023/04/midjourney-fun-2.png 2400w" sizes="(min-width: 720px) 720px"></div></div></div><figcaption>I sincerely can&apos;t tell whether Midjourney&apos;s interpretation of the word &quot;fun&quot; is cute or just plain disturbing</figcaption></figure>]]></content:encoded></item><item><title><![CDATA[Conditionally 
debounce value updates, in Swift]]></title><description><![CDATA[Improve your iOS app's performance and enhance your users' experience by learning how to conditionally debounce events.

Also, it's a panacea that solves everything. Ok, almost everything. Depending on how you look at it.]]></description><link>https://weekly.elfitz.com/2023/04/16/conditionally-debounce-value-updates/</link><guid isPermaLink="false">63d3bcf601970d004d027523</guid><category><![CDATA[Swift Programming]]></category><category><![CDATA[iOS Development]]></category><category><![CDATA[Mobile Development]]></category><category><![CDATA[Programming]]></category><dc:creator><![CDATA[ElFitz]]></dc:creator><pubDate>Sun, 16 Apr 2023 17:56:56 GMT</pubDate><media:content url="https://images.unsplash.com/photo-1535480436112-07697fcbcbea?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=MnwxMTc3M3wwfDF8c2VhcmNofDF8fGJvdW5jaW5nJTIwYmFsbHxlbnwwfHx8fDE2ODE2NjY5ODA&amp;ixlib=rb-4.0.3&amp;q=80&amp;w=2000" medium="image"/><content:encoded><![CDATA[<img src="https://images.unsplash.com/photo-1535480436112-07697fcbcbea?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=MnwxMTc3M3wwfDF8c2VhcmNofDF8fGJvdW5jaW5nJTIwYmFsbHxlbnwwfHx8fDE2ODE2NjY5ODA&amp;ixlib=rb-4.0.3&amp;q=80&amp;w=2000" alt="Conditionally debounce value updates, in Swift"><p>Recently, I&apos;ve been working on an app that uses a camera. The feature I was on involved detecting certain features (calibration markers) in real-time and making sure the users could only take a picture if and when <a href="https://opencv.org/?ref=weekly.elfitz.com">OpenCV</a> detected those features. Meaning that the camera shutter button would only be enabled if and when <a href="https://opencv.org/?ref=weekly.elfitz.com">OpenCV</a> had detected the desired features. 
</p><p>However, while remarkably performant and more than capable of keeping up with the camera&apos;s frame rate, <a href="https://opencv.org/?ref=weekly.elfitz.com">OpenCV</a> would often briefly and unpredictably fail to detect the desired features, meaning actually taking a picture pretty much involved racing against <a href="https://opencv.org/?ref=weekly.elfitz.com">OpenCV</a>. Good luck with that.</p><p>To make it easier for our users, and to make sure these mere mortals wouldn&apos;t have to become semi-professional StarCraft players just to use our UI, I decided to give them an edge, by buffering the last valid frame for a few tenths of a second. Which meant implementing some sort of conditional debounce.</p><h2 id="what-is-a-debounce">What is a debounce?</h2><p>Debouncing is a technique used to optimize the performance of functions that are frequently triggered by user interactions or other events, such as scrolling, resizing, or typing.</p><p>When a debounced function is invoked, it starts a timer and waits for a specified period (debounce time) without any further calls to the function. If the function is called again within the debounce time, the timer is reset, and the waiting period starts again. The function is only executed once the debounce time has passed without any additional calls. This is particularly useful when dealing with events that generate a high number of rapid triggers, such as... text input. Or detecting features in real time in camera frames.</p><p>By grouping multiple successive calls into a single one and executing the function after a period of inactivity, debouncing limits the number of times the function is executed.</p><h3 id="example-search-as-you-type">Example: Search as you type</h3><p>Let&apos;s take an example: search as you type.</p><p>We&apos;ve all seen it. 
And when it&apos;s well done, we&apos;ve all loved it.</p><p>When implementing search as you type, you may want to avoid sending a request to your backend at <strong>&#x2022; every &#x2022; single &#x2022; user &#x2022; keystroke</strong>. Because that would most likely involve sending many requests, cancelling them every time the user presses a new key, and making sure your concurrent requests don&apos;t generate race conditions (after all, why <strong><em>shouldn&apos;t</em></strong> the first request return <em>after</em> the second one?), or slow your UI to a crawl. By the way, if someone involved with the World of Hyatt iOS app ever reads this, yes, I would be more than willing to fix both of these issues for you. Seriously, are you guys running all those network calls on the main thread?</p><p>Debouncing allows us to group these keystrokes into a single string, and wait until the user stops continuously typing away before sending our query to our backend. Where we would have had multiple requests to handle and cancel, we only have one, only sent once the user has slowed down.</p><h2 id="debouncing-conditionally">Debouncing, <em>conditionally</em></h2><p>In my case, the first step is to detect the features on a new frame. When these features are successfully identified, they are compared with the previously stored result. If any differences are detected between the new and old results, the stored result is updated with the new values, and the camera shutter button is enabled. Consequently, pressing the shutter button will use these updated results. 
However, if the features cannot be found, the system waits for 0.3 seconds before clearing the stored result and disabling the camera shutter button.</p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://weekly.elfitz.com/content/images/2023/04/debounce-decision-tree.png" class="kg-image" alt="Conditionally debounce value updates, in Swift" loading="lazy" width="991" height="444" srcset="https://weekly.elfitz.com/content/images/size/w600/2023/04/debounce-decision-tree.png 600w, https://weekly.elfitz.com/content/images/2023/04/debounce-decision-tree.png 991w" sizes="(min-width: 720px) 720px"><figcaption>Conditional Debouncing decision tree, courtesy of <a href="https://openai.com/research/gpt-4?ref=weekly.elfitz.com">GPT-4</a> &amp; <a href="https://mermaid.js.org/?ref=weekly.elfitz.com">Mermaid.js</a>. But more on that in a later post.</figcaption></figure><h3 id="implementation">Implementation</h3><p>The following is a generic solution. It takes three parameters: a time interval (<code>delay</code>), an action closure that takes a value of type T, and a <code>shouldDebounce</code> closure that takes a value of type T and returns a boolean value.</p><p>The function returns a closure that accepts a value of type T and performs the specified action either immediately or after a delay, depending on the result of the <code>shouldDebounce</code> closure.</p><pre><code class="language-swift">func conditionalDebounce&lt;T: Equatable&gt;(delay: TimeInterval, action: @escaping (T) -&gt; Void, shouldDebounce: @escaping (T) -&gt; Bool) -&gt; (T) -&gt; Void {
    // Store a reference to the dispatch work item
    var dispatchWorkItem: DispatchWorkItem?

    // Return a closure that takes a value of type T as input
    return { value in
        // Cancel any existing dispatch work item
        dispatchWorkItem?.cancel()

        // Check if the input value should trigger a debounce
        if shouldDebounce(value) {
            // Create a new dispatch work item to perform the action after the specified delay
            let task = DispatchWorkItem {
                DispatchQueue.main.async {
                    action(value)
                }
            }
            // Update the reference to the dispatch work item
            dispatchWorkItem = task
            // Schedule the dispatch work item to run on the main queue after the delay
            DispatchQueue.main.asyncAfter(deadline: DispatchTime.now() + delay, execute: task)
        } else {
            // If debounce is not needed, execute the action immediately
            action(value)
        }
    }
}</code></pre><p>Below, a couple simple examples.</p><h4 id="examples">Examples</h4><h5 id="debounce-if-nil">Debounce if nil</h5><h6 id="code">Code</h6><pre><code class="language-swift">func conditionalDebounce&lt;T: Equatable&gt;(delay: TimeInterval, action: @escaping (T) -&gt; Void, shouldDebounce: @escaping (T) -&gt; Bool) -&gt; (T) -&gt; Void {
    // Store a reference to the dispatch work item
    var dispatchWorkItem: DispatchWorkItem?

    // Return a closure that takes a value of type T as input
    return { value in
        // Cancel any existing dispatch work item
        dispatchWorkItem?.cancel()

        // Check if the input value should trigger a debounce
        if shouldDebounce(value) {
            // Create a new dispatch work item to perform the action after the specified delay
            let task = DispatchWorkItem {
                DispatchQueue.main.async {
                    action(value)
                }
            }
            // Update the reference to the dispatch work item
            dispatchWorkItem = task
            // Schedule the dispatch work item to run on the main queue after the delay
            DispatchQueue.main.asyncAfter(deadline: DispatchTime.now() + delay, execute: task)
        } else {
            // If debounce is not needed, execute the action immediately
            action(value)
        }
    }
}

// Define an example action to perform
func actionToPerform(value: String?) {
    print(&quot;Performed action for value: \(value ?? &quot;nil&quot;)&quot;)
}

// Define a predicate for debouncing nil values
func debouncePredicate&lt;T&gt;(newValue: T?) -&gt; Bool {
    return newValue == nil
}

// Create a debouncer for nil values
let debouncer = conditionalDebounce(delay: 2, action: actionToPerform(value:), shouldDebounce: debouncePredicate)

// An array of optional String values to test the debouncer
let values: [String?] = [
    &quot;First&quot;,
    nil,
    &quot;Second&quot;,
    &quot;Second&quot;,
    nil,
    nil,
    nil,
    &quot;Third&quot;,
    &quot;Third&quot;,
    &quot;Third&quot;,
    &quot;Fourth&quot;
]

print(&quot;\n========= Debounce On Nil ======================\n&quot;)
for value in values {
    debouncer(value)
}</code></pre><h6 id="output">Output</h6><pre><code class="language-shell">========= Debounce On Nil ======================

Performed action for value: First
Performed action for value: Second
Performed action for value: Second
Performed action for value: Third
Performed action for value: Third
Performed action for value: Third
Performed action for value: Fourth</code></pre><h5 id="debounce-if-duplicate">Debounce if duplicate</h5><h6 id="code-1">Code</h6><pre><code class="language-swift">func conditionalDebounce&lt;T: Equatable&gt;(delay: TimeInterval, action: @escaping (T) -&gt; Void, shouldDebounce: @escaping (T) -&gt; Bool) -&gt; (T) -&gt; Void {
    // Store a reference to the dispatch work item
    var dispatchWorkItem: DispatchWorkItem?

    // Return a closure that takes a value of type T as input
    return { value in
        // Cancel any existing dispatch work item
        dispatchWorkItem?.cancel()

        // Check if the input value should trigger a debounce
        if shouldDebounce(value) {
            // Create a new dispatch work item to perform the action after the specified delay
            let task = DispatchWorkItem {
                DispatchQueue.main.async {
                    action(value)
                }
            }
            // Update the reference to the dispatch work item
            dispatchWorkItem = task
            // Schedule the dispatch work item to run on the main queue after the delay
            DispatchQueue.main.asyncAfter(deadline: DispatchTime.now() + delay, execute: task)
        } else {
            // If debounce is not needed, execute the action immediately
            action(value)
        }
    }
}

// Define an example action to perform
func actionToPerform(value: String?) {
    print(&quot;Performed action for value: \(value ?? &quot;nil&quot;)&quot;)
}

// Define a predicate for debouncing duplicate values
func debouncePredicate&lt;T: Equatable&gt;() -&gt; (T?) -&gt; Bool {
    var previousValue: T?
    return { newValue in
        let result = newValue == previousValue
        previousValue = newValue
        return result
    }
}

// Create a debouncer for duplicate values
let debouncer = conditionalDebounce(delay: 2, action: actionToPerform(value:), shouldDebounce: debouncePredicate())

// An array of optional String values to test the debouncer
let values: [String?] = [
    &quot;First&quot;,
    nil,
    &quot;Second&quot;,
    &quot;Second&quot;,
    nil,
    nil,
    nil,
    &quot;Third&quot;,
    &quot;Third&quot;,
    &quot;Third&quot;,
    &quot;Fourth&quot;
]

print(&quot;\n========= Debounce On Duplicate ================\n&quot;)
for value in values {
    debouncer(value)
}
</code></pre><h6 id="output-1">Output</h6><pre><code class="language-shell">========= Debounce On Duplicate ================

Performed action for value: First
Performed action for value: nil
Performed action for value: Second
Performed action for value: nil
Performed action for value: Third
Performed action for value: Fourth</code></pre><h5 id="debounce-if-nil-or-duplicate">Debounce if nil or duplicate</h5><h6 id="code-2">Code</h6><pre><code class="language-swift">func conditionalDebounce&lt;T: Equatable&gt;(delay: TimeInterval, action: @escaping (T) -&gt; Void, shouldDebounce: @escaping (T) -&gt; Bool) -&gt; (T) -&gt; Void {
    // Store a reference to the dispatch work item
    var dispatchWorkItem: DispatchWorkItem?

    // Return a closure that takes a value of type T as input
    return { value in
        // Cancel any existing dispatch work item
        dispatchWorkItem?.cancel()

        // Check if the input value should trigger a debounce
        if shouldDebounce(value) {
            // Create a new dispatch work item to perform the action after the specified delay
            let task = DispatchWorkItem {
                DispatchQueue.main.async {
                    action(value)
                }
            }
            // Update the reference to the dispatch work item
            dispatchWorkItem = task
            // Schedule the dispatch work item to run on the main queue after the delay
            DispatchQueue.main.asyncAfter(deadline: DispatchTime.now() + delay, execute: task)
        } else {
            // If debounce is not needed, execute the action immediately
            action(value)
        }
    }
}

// Define an example action to perform
func actionToPerform(value: String?) {
    print(&quot;Performed action for value: \(value ?? &quot;nil&quot;)&quot;)
}

// Define a predicate for debouncing both nil and duplicate values
func debouncePredicate&lt;T: Equatable&gt;() -&gt; (T?) -&gt; Bool {
    var previousValue: T?
    return { newValue in
        let result = newValue == nil || newValue == previousValue
        previousValue = newValue
        return result
    }
}
// Create a debouncer for nil or duplicate values
let debouncer = conditionalDebounce(delay: 2, action: actionToPerform(value:), shouldDebounce: debouncePredicate())

// An array of optional String values to test the debouncer
let values: [String?] = [
    &quot;First&quot;,
    nil,
    &quot;Second&quot;,
    &quot;Second&quot;,
    nil,
    nil,
    nil,
    &quot;Third&quot;,
    &quot;Third&quot;,
    &quot;Third&quot;,
    &quot;Fourth&quot;
]

print(&quot;\n========= Debounce On Nil or Duplicate =========\n&quot;)
for value in values {
    debouncer(value)
}</code></pre><h6 id="output-2">Output</h6><pre><code class="language-shell">========= Debounce On Nil or Duplicate =========

Performed action for value: First
Performed action for value: Second
Performed action for value: Third
Performed action for value: Fourth</code></pre><h2 id="epilogue">Epilogue</h2><p>And there you have it. <s>The definitive cure to all of mankind&apos;s plagues</s>. I hope it is as useful to you as it was to me, and wish you a very pleasant day! See you around!</p>]]></content:encoded></item><item><title><![CDATA[Metaprogramming in Swift, and the issue with Process and relative paths]]></title><description><![CDATA[Find out why your Swift CLI tool isn't capable of dealing with relative paths, and how to solve it!]]></description><link>https://weekly.elfitz.com/2023/03/23/metaprogramming-in-swift-and-the-issue-with-process-and-relative-paths/</link><guid isPermaLink="false">6419b3156f8427003d5f7df6</guid><category><![CDATA[Swift Programming]]></category><category><![CDATA[Programming]]></category><category><![CDATA[Debugging]]></category><category><![CDATA[shell]]></category><dc:creator><![CDATA[ElFitz]]></dc:creator><pubDate>Thu, 23 Mar 2023 14:12:47 GMT</pubDate><content:encoded><![CDATA[<p>While fooling around trying to create a metaprogramming tool written in Swift, I noticed a curious issue whenever I asked said tool to run shell scripts using relative paths, such as <code>../../test2</code>.</p><pre><code class="language-shell">~/P/c/t/Project (main) [1]&gt; swift run Project ../../test2/build_and_run.sh
Error: Could not execute file build_and_run.sh Error Domain=NSCocoaErrorDomain Code=4 &quot;The file &#x201C;test2&#x201D; doesn&#x2019;t exist.&quot; UserInfo={NSFilePath=/Users/user/Projects/currentProject/test/Project/test2}</code></pre><p>In this case, we are currently in the <code>~/Projects/currentProject/test/Project/</code> directory, trying to get our CLI tool to run <code>build_and_run.sh</code> in the <code>~/Projects/currentProject/test2</code> directory. And it&apos;s not going well.</p><p>I had an inkling this was due to the use of relative paths, as you can see below:</p><pre><code class="language-swift">if file.filename.hasSuffix(&quot;.sh&quot;) {
            do {
                print(&quot;Executing file: \(file.filename)&quot;)
                let task = Process()
                task.currentDirectoryURL = URL(fileURLWithPath: outputPath)
                task.executableURL = URL(fileURLWithPath: outputPath).absoluteURL.appendingPathComponent(file.filename)
                task.launchPath = task.executableURL?.absoluteURL.path
                task.arguments = []
                let pipe = Pipe()
                task.standardOutput = pipe
                try task.run()
                task.waitUntilExit()
                let data = pipe.fileHandleForReading.readDataToEndOfFile()
                let output = String(data: data, encoding: .utf8)
                print(&quot;Output of \(file.filename): \n\(output ?? &quot;No output&quot;).&quot;)
            } catch (let error) {
                print(&quot;Error: Could not execute file \(file.filename)&quot;, error)
            }
        }</code></pre><p>Printing them, I noticed the paths did indeed not make any sense. It seemed as if the relative part of the URL, <code>../</code> was simply ignored.</p><pre><code class="language-shell">Executing file: build_and_run.sh
Output Path:  file:///Users/user/Projects/currentProject/test/Project/test2/
Current Directory URL:  Optional(&quot;file:////Users/user/Projects/currentProject/test/Project/test2/&quot;)
Executable URL:  Optional(&quot;file:///Users/user/Projects/currentProject/test/Project/test2/build_and_run.sh&quot;)
Launchpath:  Optional(&quot;/Users/user/Projects/currentProject/test/Project/test2/build_and_run.sh&quot;)</code></pre><p>It turns out, as kindly explained <a href="https://forums.swift.org/t/process-currentdirectoryurl-strange-behaviour/36968/5?ref=weekly.elfitz.com">by our dear Quinn on the Swift Forums</a>, that&apos;s actually the case:</p><!--kg-card-begin: markdown--><blockquote>
<p>Here&#x2019;s what&#x2019;s going on&#x2026;</p>
<p>The <code>currentDirectoryURL</code> property is a relatively recent addition. Historically, <code>NSTask</code> (and hence <code>Process</code>) only supported <code>currentDirectoryPath</code>.</p>
<p>The new currentDirectoryURL property is a wrapper around the old currentDirectoryPath property. Internally the setter converts the URL to a path and the getter does the reverse.</p>
<p>The conversion process in the setter uses url.standardizedURL.path.</p>
<p>The standardizedURL property is documented to return <em>A copy of the URL with any instances of &quot;..&quot; or &quot;.&quot; removed from its path</em>. The tricky part here is that it looks just at the URL&#x2019;s path. If the URL is relative, it ignores the path in the base URL.</p>
<p>In your case the URL is relative:</p>
<pre><code>print(directoryURL)
// prints: ../usr/ -- file:///Users/ 
</code></pre>
<p>and thus standardized returns unhelpful results:</p>
<pre><code>print(directoryURL.standardized)
usr/ -- file:///Users
</code></pre>
<p>You can avoid the problem with a judicious application of absoluteURL.</p>
<pre><code>process.currentDirectoryURL = directoryURL.absoluteURL
</code></pre>
<p>Share and Enjoy</p>
<p>Quinn &#x201C;The Eskimo!&#x201D; @ DTS @ Apple</p>
</blockquote>
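<p>Quinn&apos;s explanation is easy to reproduce. Here is a quick sketch of the pitfall (the paths are illustrative):</p><pre><code class="language-swift">import Foundation

let base = URL(fileURLWithPath: "/Users/user/")
let directoryURL = URL(fileURLWithPath: "../usr/", relativeTo: base)

// standardized only considers the URL's own relative path, so the
// leading ".." is dropped rather than resolved against the base
print(directoryURL.standardized)

// Resolving against the base first, then standardizing, behaves as expected
print(directoryURL.absoluteURL.standardized)</code></pre>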
<!--kg-card-end: markdown--><p>So, just to be on the safe side, let&apos;s use absolute URLs everywhere, like this:</p><pre><code class="language-swift">if file.filename.hasSuffix(&quot;.sh&quot;) {
            do {
                print(&quot;Executing file: \(file.filename)&quot;)
                let task = Process()
                task.currentDirectoryURL = URL(fileURLWithPath: outputPath).absoluteURL
                task.executableURL = URL(fileURLWithPath: outputPath).absoluteURL.appendingPathComponent(file.filename)
                task.launchPath = task.executableURL?.absoluteURL.path
                task.arguments = []
                let pipe = Pipe()
                task.standardOutput = pipe
                try task.run()
                task.waitUntilExit()
                let data = pipe.fileHandleForReading.readDataToEndOfFile()
                let output = String(data: data, encoding: .utf8)
                print(&quot;Output of \(file.filename): \n\(output ?? &quot;No output&quot;).&quot;)
            } catch (let error) {
                print(&quot;Error: Could not execute file \(file.filename)&quot;, error)
            }
        }</code></pre><p>And lo and behold, it works.</p><pre><code class="language-shell">~/P/c/t/Project (main) [1]&gt; swift run Project ../../test2/build_and_run.sh
Executing file: build_and_run.sh
Building for debugging...
Build complete! (0.13s)
Output of build_and_run.sh:
[0/1] Planning build
Building for debugging...
[1/3] Compiling Project main.swift
[2/3] Emitting module Project
[2/3] Linking Project
Build complete! (2.55s)
Usage: ./Project outputDirectoryPath pathToFile</code></pre><p>Meanwhile, Xcode was as useless as ever</p><figure class="kg-card kg-image-card"><img src="https://weekly.elfitz.com/content/images/2023/03/Screenshot-xcode-process-absolute-url.png" class="kg-image" alt loading="lazy" width="1158" height="124" srcset="https://weekly.elfitz.com/content/images/size/w600/2023/03/Screenshot-xcode-process-absolute-url.png 600w, https://weekly.elfitz.com/content/images/size/w1000/2023/03/Screenshot-xcode-process-absolute-url.png 1000w, https://weekly.elfitz.com/content/images/2023/03/Screenshot-xcode-process-absolute-url.png 1158w" sizes="(min-width: 720px) 720px"></figure><h2 id="epilogue">Epilogue</h2><p>Anyway, if you&apos;ve landed here, it&apos;s probably because you had a similar issue, so I hope it helped, see you around!</p>]]></content:encoded></item><item><title><![CDATA[Convert between image (pixels) and scaleAspectFit UIImageView coordinates like a champ!]]></title><description><![CDATA[When working with images in iOS, it's sometimes necessary to convert between an image's pixel coordinates and the coordinates of the UIImageView that displays the image. 
Here's how to do it.]]></description><link>https://weekly.elfitz.com/2023/02/02/convert-between-image-pixels-and-uiimageview-coordinates-like-a-champ/</link><guid isPermaLink="false">63a43f73257cb1003d11918f</guid><category><![CDATA[UIKit]]></category><category><![CDATA[Swift Programming]]></category><category><![CDATA[iOS Development]]></category><category><![CDATA[Programming]]></category><dc:creator><![CDATA[ElFitz]]></dc:creator><pubDate>Thu, 02 Feb 2023 22:41:35 GMT</pubDate><media:content url="https://images.unsplash.com/photo-1489702932289-406b7782113c?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=MnwxMTc3M3wwfDF8c2VhcmNofDh8fG1hcHxlbnwwfHx8fDE2NzM2MDk1MjQ&amp;ixlib=rb-4.0.3&amp;q=80&amp;w=2000" medium="image"/><content:encoded><![CDATA[<img src="https://images.unsplash.com/photo-1489702932289-406b7782113c?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=MnwxMTc3M3wwfDF8c2VhcmNofDh8fG1hcHxlbnwwfHx8fDE2NzM2MDk1MjQ&amp;ixlib=rb-4.0.3&amp;q=80&amp;w=2000" alt="Convert between image (pixels) and scaleAspectFit UIImageView coordinates like a champ!"><p>When working with images in iOS, it&apos;s sometimes necessary to convert between an image&apos;s pixel coordinates and the coordinates of the UIImageView that displays the image. For example, you might want to know the pixel coordinates of a specific point in the image that the user tapped on, or you might want to highlight a specific area of the image by drawing a rectangle in the UIImageView.</p><p>Having a few years of iOS development experience, I thought this would be a somewhat complex matter (ever heard of image scaling? Well, have a look) that I shouldn&apos;t just wing it if I didn&apos;t want it to come bite me in the buttocks later on, and that there had to be some pretty standard way to deal with it. 
Oh sweet summer child...</p><p>While I did find a library, <a href="https://github.com/paulz/ImageCoordinateSpace?ref=weekly.elfitz.com">it is archived and has been abandoned for two years</a>. And <a href="https://stackoverflow.com/a/17200285/7243001?ref=weekly.elfitz.com">the first StackOverflow answer</a> doesn&apos;t account for differences in ratios between the image and its containing <code>UIImageView</code> when using <code>.scaleAspectFit</code> scaling. <a href="https://stackoverflow.com/a/51083032/7243001?ref=weekly.elfitz.com">This self-answer had a nice idea</a>, namely using <code>AVMakeRect</code> to calculate the rect that the image would take up if it were scaled to fit within the UIImageView&apos;s bounds while maintaining its aspect ratio. But it could be improved and made both more generic (and thus reusable), and clearer.</p><p>So here&apos;s what I came up with.</p><h2 id="solution">Solution</h2><p>We can break it down into three steps:</p><ol><li>Figure out the image&apos;s scaling factor</li><li>Figure out the image&apos;s clipping, to calculate the image&apos;s origin x and y offsets</li><li>Using said offsets and scaling factor, determine the point&apos;s pixel coordinates in the image</li></ol><h2 id="implementation">Implementation</h2><h3 id="detect-the-images-scaling-factor">Detect the image&apos;s scaling factor</h3><p>This function calculates the scale factor by comparing the aspect ratios of the image and the UIImageView. It uses the AVMakeRect function to calculate the rect that the image would take up if it were scaled to fit within the UIImageView&apos;s bounds while maintaining its aspect ratio. It then compares the width and height of the image to the width and height of this rect to determine the scale factor.</p><pre><code class="language-swift">func scale(for image: UIImage, in imageView: UIImageView) -&gt; CGFloat {
    let rect = AVMakeRect(aspectRatio: image.size, insideRect: imageView.bounds)
    // The aspect-fit rect has the same aspect ratio as the image,
    // so the width ratio and the height ratio are identical
    return image.size.width / rect.width
}</code></pre><h3 id="detect-the-images-clipping">Detect the image&apos;s clipping</h3><p>This function calculates the rect that the image is clipped to in the UIImageView.</p><p>It first checks that the UIImageView has an image set, and if not, it returns the bounds of the UIImageView.</p><p>It also checks that the content mode of the UIImageView is set to <code>scaleAspectFit</code> (note that UIImageView&apos;s default content mode is actually <code>scaleToFill</code>). If this is not the case, then the image is not being clipped that way and the function returns the bounds of the UIImageView.</p><p>If the conditions are met, then the function uses the AVMakeRect function to calculate the rect that the image would take up if it were scaled to fit within the UIImageView&apos;s bounds while maintaining its aspect ratio. This rect represents the area of the image that is visible in the UIImageView.</p><pre><code class="language-swift">func contentClippingRect(for imageView: UIImageView) -&gt; CGRect {
    guard let image = imageView.image else { return imageView.bounds }
    guard imageView.contentMode == .scaleAspectFit else { return imageView.bounds }
    return AVMakeRect(aspectRatio: image.size, insideRect: imageView.bounds)
}</code></pre><h3 id="calculating-the-points-coordinate-in-the-image-in-pixels">Calculating the point&apos;s coordinate in the image, in pixels</h3><p>This function takes in a point in the UIImageView&apos;s coordinate system and a reference to the UIImageView itself, and returns a point in the image&apos;s pixel coordinate system.</p><p>First, it checks that the reference view has an image set, and if not, it returns nil. Next, it uses the <code>contentClippingRect</code> function to get the rect that the image is clipped to in the UIImageView. The <code>scale</code> function is then used to calculate the scale factor that is used to convert the point from the UIImageView&apos;s coordinate system to the image&apos;s pixel coordinate system. The point is then returned with the x and y values multiplied by the scale factor.</p><pre><code class="language-Swift">func coordinatesInImageForPoint(point: CGPoint, from referenceView: UIImageView) -&gt; CGPoint? {
    guard let image = referenceView.image else {
        return nil
    }
    let imageRect = contentClippingRect(for: referenceView)
    let scale = scale(for: image, in: referenceView)
    return CGPoint(
        x: (point.x - imageRect.origin.x) * scale,
        y: (point.y - imageRect.origin.y) * scale
    )
}</code></pre><h2 id="all-together">All together</h2><pre><code>func coordinatesInImageForPoint(point: CGPoint, from referenceView: UIImageView) -&gt; CGPoint? {
    guard let image = referenceView.image else {
        return nil
    }
    let imageRect = contentClippingRect(for: referenceView)
    let scale = scale(for: image, in: referenceView)
    return CGPoint(
        x: (point.x - imageRect.origin.x) * scale,
        y: (point.y - imageRect.origin.y) * scale
    )
}
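
// Worked example (hypothetical numbers): a 3000x2000 px image displayed
// aspect-fit in a 300x300 pt view is drawn in a 300x200 pt rect with
// origin (0, 50), and the scale factor is 3000 / 300 = 10. The view point
// (150, 150) therefore maps to pixel ((150 - 0) * 10, (150 - 50) * 10),
// i.e. (1500, 1000): the centre of the image.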

func scale(for image: UIImage, in imageView: UIImageView) -&gt; CGFloat {
    let rect = AVMakeRect(aspectRatio: image.size, insideRect: imageView.bounds)
    // The aspect-fit rect has the same aspect ratio as the image,
    // so the width ratio and the height ratio are identical
    return image.size.width / rect.width
}

func contentClippingRect(for imageView: UIImageView) -&gt; CGRect {
    guard let image = imageView.image else { return imageView.bounds }
    guard imageView.contentMode == .scaleAspectFit else { return imageView.bounds }
    return AVMakeRect(aspectRatio: image.size, insideRect: imageView.bounds)
}</code></pre><h2 id="conclusion">Conclusion</h2><p>And that&apos;s it folks! With these three handy functions, you&apos;ll be able to convert between image pixels and UIImageView coordinates like a pro. Which is basically just mapping between different coordinate systems, which should be easy, right? Oh, well, nevermind. I hope it helps, and, as usual, I wish you all a great day!</p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://images.unsplash.com/photo-1551491603-7d38b9e605f5?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=MnwxMTc3M3wwfDF8c2VhcmNofDExfHxzd2ltbWluZyUyMHBvb2x8ZW58MHx8fHwxNjczNTkxMjk4&amp;ixlib=rb-4.0.3&amp;q=80&amp;w=2000" class="kg-image" alt="Convert between image (pixels) and scaleAspectFit UIImageView coordinates like a champ!" loading="lazy" width="3024" height="4032" srcset="https://images.unsplash.com/photo-1551491603-7d38b9e605f5?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=MnwxMTc3M3wwfDF8c2VhcmNofDExfHxzd2ltbWluZyUyMHBvb2x8ZW58MHx8fHwxNjczNTkxMjk4&amp;ixlib=rb-4.0.3&amp;q=80&amp;w=600 600w, https://images.unsplash.com/photo-1551491603-7d38b9e605f5?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=MnwxMTc3M3wwfDF8c2VhcmNofDExfHxzd2ltbWluZyUyMHBvb2x8ZW58MHx8fHwxNjczNTkxMjk4&amp;ixlib=rb-4.0.3&amp;q=80&amp;w=1000 1000w, https://images.unsplash.com/photo-1551491603-7d38b9e605f5?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=MnwxMTc3M3wwfDF8c2VhcmNofDExfHxzd2ltbWluZyUyMHBvb2x8ZW58MHx8fHwxNjczNTkxMjk4&amp;ixlib=rb-4.0.3&amp;q=80&amp;w=1600 1600w, https://images.unsplash.com/photo-1551491603-7d38b9e605f5?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=MnwxMTc3M3wwfDF8c2VhcmNofDExfHxzd2ltbWluZyUyMHBvb2x8ZW58MHx8fHwxNjczNTkxMjk4&amp;ixlib=rb-4.0.3&amp;q=80&amp;w=2400 2400w" sizes="(min-width: 720px) 720px"><figcaption>The swimming pool I&apos;m not in currently, because instead of enjoying life I&apos;m writing this, and working, and there&apos;s too 
much sun now anyway... wait. What am I complaining about? - Photo by <a href="https://unsplash.com/@jay_solomon?utm_source=ghost&amp;utm_medium=referral&amp;utm_campaign=api-credit">Jay Solomon</a> / <a href="https://unsplash.com/?utm_source=ghost&amp;utm_medium=referral&amp;utm_campaign=api-credit">Unsplash</a></figcaption></figure>]]></content:encoded></item><item><title><![CDATA[Making a zoomable Image, in SwiftUI]]></title><description><![CDATA[This should have been simple, but it wasn't. So if you too are having a hard time making  a zoomable image in SwiftUI, come on in!]]></description><link>https://weekly.elfitz.com/2023/01/18/making-a-zoomable-image-in-swiftui/</link><guid isPermaLink="false">6319e77ae3eda2003dcbc3c6</guid><category><![CDATA[SwiftUI]]></category><category><![CDATA[Swift Programming]]></category><dc:creator><![CDATA[ElFitz]]></dc:creator><pubDate>Wed, 18 Jan 2023 11:11:09 GMT</pubDate><media:content url="https://images.unsplash.com/photo-1656257324829-d841f8a9fa57?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=MnwxMTc3M3wwfDF8c2VhcmNofDExfHxzcHlnbGFzc3xlbnwwfHx8fDE2NzM2MDc1Njc&amp;ixlib=rb-4.0.3&amp;q=80&amp;w=2000" medium="image"/><content:encoded><![CDATA[<img src="https://images.unsplash.com/photo-1656257324829-d841f8a9fa57?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=MnwxMTc3M3wwfDF8c2VhcmNofDExfHxzcHlnbGFzc3xlbnwwfHx8fDE2NzM2MDc1Njc&amp;ixlib=rb-4.0.3&amp;q=80&amp;w=2000" alt="Making a zoomable Image, in SwiftUI"><p>Welcome to this tutorial on creating a zoomable image in SwiftUI! 
If you&apos;ve been struggling with various options for creating a zoomable image in SwiftUI, you&apos;re not alone.</p><p>While there are many solutions, <a href="https://stackoverflow.com/questions/69669905/cannot-pan-image-in-zoomed-scrollview?ref=weekly.elfitz.com">some require tons of boilerplate</a>, and others only <a href="https://stackoverflow.com/questions/58341820/isnt-there-an-easy-way-to-pinch-to-zoom-in-an-image-in-swiftui?ref=weekly.elfitz.com">work in UIKit</a>. <a href="https://stackoverflow.com/questions/58341820/isnt-there-an-easy-way-to-pinch-to-zoom-in-an-image-in-swiftui?ref=weekly.elfitz.com">Some, finally, simply work... until they don&apos;t</a>.<br><br>Having tried those, and looked into a few libraries (I especially liked <a href="https://github.com/Jake-Short/swiftui-image-viewer?ref=weekly.elfitz.com">this one</a>, until I realized it would only stay zoomed in as long as you were actually pinching), I can say <a href="https://stackoverflow.com/a/67577296/7243001?ref=weekly.elfitz.com">JarWarren</a> and <a href="https://stackoverflow.com/a/71980750/7243001?ref=weekly.elfitz.com">Haolong</a> were right. (Un)surprisingly, turning our image into a PDF document and displaying it using <code>PDFKit</code> <em>is</em> the simplest way I&apos;ve found so far to make a zoomable <code>Image</code> you can pan around in SwiftUI (using <code>UIViewRepresentable</code>).</p><p>For those of you struggling with it, here&apos;s my take on it (careful, <a href="https://stackoverflow.com/questions/62515219/how-to-make-pdfview-zoom-100-to-fit-screen-size?ref=weekly.elfitz.com">it seems PDFKit</a> has <a href="https://stackoverflow.com/questions/52863925/ios-pdfkit-not-filling-uiview?ref=weekly.elfitz.com">its own weird quirks</a>):</p><pre><code class="language-Swift">struct ZoomableImage: UIViewRepresentable {

        // used to set the image that will be displayed in the PDFView
        private(set) var image: UIImage
        
        // sets the background color of the PDFView
        private(set) var backgroundColor: Color
        
        // sets the minimum scale factor for zooming out of the image
        private(set) var minScaleFactor: CGFloat
 
        // sets the ideal scale factor for the image when it is first displayed in the PDFView
        // the initial zoom level of the image when it is first displayed
        private(set) var idealScaleFactor: CGFloat
        
        // sets the maximum scale factor for zooming in on the image
        private(set) var maxScaleFactor: CGFloat

        public init(
            image: UIImage,
            backgroundColor: Color,
            minScaleFactor: CGFloat,
            idealScaleFactor: CGFloat,
            maxScaleFactor: CGFloat
        ) {
            self.image = image
            self.backgroundColor = backgroundColor
            self.minScaleFactor = minScaleFactor
            self.idealScaleFactor = idealScaleFactor
            self.maxScaleFactor = maxScaleFactor
        }

        public func makeUIView(context: Context) -&gt; PDFView {
            let view = PDFView()
            guard let page = PDFPage(image: image) else { return view }
            let document = PDFDocument()
            document.insert(page, at: 0)

            // UIColor(_:) avoids force-unwrapping backgroundColor.cgColor, which can be nil
            view.backgroundColor = UIColor(backgroundColor)

            view.autoScales = true
            view.document = document

            view.maxScaleFactor = maxScaleFactor
            view.minScaleFactor = minScaleFactor
            view.scaleFactor = idealScaleFactor
            return view
        }

        public func updateUIView(_ uiView: PDFView, context: Context) {
            // empty
        }
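
        // Usage sketch (the image name is illustrative):
        //
        //     var body: some View {
        //         ZoomableImage(
        //             image: UIImage(named: "photo")!,
        //             backgroundColor: .black,
        //             minScaleFactor: 0.5,
        //             idealScaleFactor: 1,
        //             maxScaleFactor: 5
        //         )
        //     }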
 }</code></pre><p>It is important to note that the <code>idealScaleFactor</code> should be between the <code>minScaleFactor</code> and <code>maxScaleFactor</code>. If not, the <code>idealScaleFactor</code> will be set to the closest of <code>minScaleFactor</code> or <code>maxScaleFactor</code>.</p><h2 id="conclusion">Conclusion</h2><p>In conclusion, creating a zoomable image in SwiftUI can be a bit tricky, but using <code>PDFKit</code> to turn the image into a PDF document and display it using a <code>PDFView</code> is a ridiculous yet simple and effective solution. Thanks for reading, and have a great day!</p>]]></content:encoded></item><item><title><![CDATA[A few awesome useful and nice to have tools]]></title><description><![CDATA[<h2 id="homebrew"><a href="https://brew.sh/?ref=weekly.elfitz.com">Homebrew</a></h2><p>My go-to tool to install pretty much anything on my Mac, <a href="https://brew.sh/?ref=weekly.elfitz.com">Homebrew</a> is a nearly ubiquitous tool in the Mac-using developer community. Don&apos;t believe me? It can be used to install every single macOS app listed here. 
And that includes the awesome screensaver listed at the end.</p>]]></description><link>https://weekly.elfitz.com/2023/01/13/a-list-of-awesome-tools/</link><guid isPermaLink="false">5d79885b9db1f80038ee3bc0</guid><category><![CDATA[Tools]]></category><category><![CDATA[macOS]]></category><category><![CDATA[Serverless]]></category><dc:creator><![CDATA[ElFitz]]></dc:creator><pubDate>Fri, 13 Jan 2023 11:22:58 GMT</pubDate><media:content url="https://images.unsplash.com/photo-1454694220579-9d6672b1ec2a?ixlib=rb-1.2.1&amp;q=80&amp;fm=jpg&amp;crop=entropy&amp;cs=tinysrgb&amp;w=2000&amp;fit=max&amp;ixid=eyJhcHBfaWQiOjExNzczfQ" medium="image"/><content:encoded><![CDATA[<h2 id="homebrew"><a href="https://brew.sh/?ref=weekly.elfitz.com">Homebrew</a></h2><img src="https://images.unsplash.com/photo-1454694220579-9d6672b1ec2a?ixlib=rb-1.2.1&amp;q=80&amp;fm=jpg&amp;crop=entropy&amp;cs=tinysrgb&amp;w=2000&amp;fit=max&amp;ixid=eyJhcHBfaWQiOjExNzczfQ" alt="A few awesome useful and nice to have tools"><p>My go-to tool to install pretty much anything on my Mac, <a href="https://brew.sh/?ref=weekly.elfitz.com">Homebrew</a> is a nearly ubiquitous tool in the Mac-using developer community. Don&apos;t believe me? It can be used to install every single macOS app listed here. 
And that includes the awesome screensaver listed at the end.</p><figure class="kg-card kg-bookmark-card"><a class="kg-bookmark-container" href="https://brew.sh/?ref=weekly.elfitz.com"><div class="kg-bookmark-content"><div class="kg-bookmark-title">Homebrew</div><div class="kg-bookmark-description">The Missing Package Manager for macOS (or Linux).</div><div class="kg-bookmark-metadata"><img class="kg-bookmark-icon" src="https://brew.sh/assets/img/apple-touch-icon.png" alt="A few awesome useful and nice to have tools"><span class="kg-bookmark-author">Homebrew</span></div></div><div class="kg-bookmark-thumbnail"><img src="https://brew.sh/assets/img/homebrew-social-card.png" alt="A few awesome useful and nice to have tools"></div></a></figure><h2 id="tyke"><a href="http://tyke.app/?ref=weekly.elfitz.com">Tyke</a></h2><p>I <em>love</em> <a href="http://tyke.app/?ref=weekly.elfitz.com">Tyke</a>. Tyke is simple. Tyke is efficient. Tyke does just one thing and does it incredibly well. Tyke is... awesome. With time, I&apos;ve even learnt to appreciate what I used to consider its one and only limitation: the fact that it doesn&apos;t keep your text in-memory after a shutdown or a reboot. How? 
It prevents me from transforming it into an endless clipboard history like I&apos;ve done with TextEdit&apos;s untitled and unsaved files (where do those go, by the way?).</p><figure class="kg-card kg-bookmark-card"><a class="kg-bookmark-container" href="http://tyke.app/?ref=weekly.elfitz.com"><div class="kg-bookmark-content"><div class="kg-bookmark-title">Tyke.app</div><div class="kg-bookmark-description">Tyke - a handy text box that lives in your macOS menu bar.</div><div class="kg-bookmark-metadata"><img class="kg-bookmark-icon" src="http://tyke.app/favicon.png" alt="A few awesome useful and nice to have tools"></div></div><div class="kg-bookmark-thumbnail"><img src="http://tyke.app/images/tyke1024.png" alt="A few awesome useful and nice to have tools"></div></a></figure><h2 id="littlesnitch"><a href="https://www.obdev.at/products/littlesnitch?ref=weekly.elfitz.com">LittleSnitch</a></h2><p>LittleSnitch is a versatile macOS firewall. I previously wrote <a href="https://weekly.elfitz.com/2019/02/12/block-ads-and-trackers-on-your-mac-with-little-snitch/">a post on how to use it block both ads and trackers on macOS</a>, but this is only part of what it can do. 
Aside from its beautiful connections map and the surprising things it sometimes reveals, LittleSnitch can also be used to block any app&apos;s Internet access, block trackers within emails, and more.</p><figure class="kg-card kg-bookmark-card"><a class="kg-bookmark-container" href="https://www.obdev.at/products/littlesnitch?ref=weekly.elfitz.com"><div class="kg-bookmark-content"><div class="kg-bookmark-title">Little Snitch 4</div><div class="kg-bookmark-description">Protects your privacy and prevents your private data from being sent out to the Internet without your knowledge.</div><div class="kg-bookmark-metadata"><img class="kg-bookmark-icon" src="https://www.obdev.at/apple-touch-icon.png" alt="A few awesome useful and nice to have tools"><span class="kg-bookmark-author">Objective Development</span></div></div><div class="kg-bookmark-thumbnail"><img src="https://www.obdev.at/Images/social-graphs/opengraph-littlesnitch.jpg" alt="A few awesome useful and nice to have tools"></div></a></figure><h2 id="tripmode"><a href="https://www.tripmode.ch/?ref=weekly.elfitz.com">TripMode</a></h2><p>This one is great if you often use your phone&apos;s tethering or hotspot feature. </p><p>TripMode allows you to restrict which apps can and cannot access the internet while using your cellphone&apos;s data plan, preventing the App Store or Steam from burning through it all by downloading some massive updates in a few minutes over 4G. 
It also helps make tethering a viable option when 3G is the only option available, by preserving bandwidth for the apps you&apos;re actually using.</p><p>Last but not least, you can also enable it on any network, such as a crappy hotel&apos;s free WiFi or an airport&apos;s 500MB capped hotspot, and make the most of your limited Internet connection.</p><p>Combine it with <a href="#littlesnitch">LittleSnitch</a>, and you&apos;ll save even more bandwidth by not loading all these trackers and ads while browsing!</p><figure class="kg-card kg-bookmark-card"><a class="kg-bookmark-container" href="https://tripmode.ch/?ref=weekly.elfitz.com"><div class="kg-bookmark-content"><div class="kg-bookmark-title">TripMode - Save data, browse faster</div><div class="kg-bookmark-description">Easily control your Mac&#x2019;s data usage on slow or expensive networks. Stop wasting money on limited data plans.</div><div class="kg-bookmark-metadata"><img class="kg-bookmark-icon" src="https://ucarecdn.com/ac25774f-4bc7-4010-a215-e1ff67e92718/" alt="A few awesome useful and nice to have tools"><span class="kg-bookmark-author">Tripmode3 logo and text b</span><span class="kg-bookmark-publisher">J. D. Biersdorfer</span></div></div><div class="kg-bookmark-thumbnail"><img src="https://ucarecdn.com/03fc3e66-5c7c-438a-bd7b-ece74df99591/image.png" alt="A few awesome useful and nice to have tools"></div></a></figure><h2 id="boom-3d"><a href="https://www.globaldelight.com/boom/?ref=weekly.elfitz.com">Boom 3D</a></h2><p>A nice way to improve your Mac&apos;s sound. It&apos;s been a bit buggy lately, but I still like it very much.</p><figure class="kg-card kg-bookmark-card"><a class="kg-bookmark-container" href="https://www.globaldelight.com/boom/?ref=weekly.elfitz.com"><div class="kg-bookmark-content"><div class="kg-bookmark-title">Boom3D, best Volume booster &amp; equalizer for Mac and Windows | Feel the Bass</div><div class="kg-bookmark-description">Experience your audio in 3D. 
With the best volume booster, equalizer &amp; bass booster, experience your Movies, Music and Games in cinematic surround sound on your laptop, desktop or mobile.</div><div class="kg-bookmark-metadata"><img class="kg-bookmark-icon" src="https://www.globaldelight.com/boom/favicon.ico" alt="A few awesome useful and nice to have tools"><span class="kg-bookmark-author">Feel the Bass</span></div></div></a></figure><h2 id="paw-now-rapidapi-"><a href="https://paw.cloud/?ref=weekly.elfitz.com">Paw</a> (now RapidAPI)</h2><p>Some would rather use Postman. Personally, I love Paw for its extensions, its ability to generate code in various languages from requests, the way it formats outputs, and its UI. It&apos;s just dead simple to use. Somehow, it took me so long to get this post out that they managed to get bought and renamed in the meantime.</p><figure class="kg-card kg-bookmark-card"><a class="kg-bookmark-container" href="https://paw.cloud/?ref=weekly.elfitz.com"><div class="kg-bookmark-content"><div class="kg-bookmark-title">Paw &#x2013; The most advanced API tool for Mac</div><div class="kg-bookmark-description">Paw is a full-featured HTTP client that lets you test and describe the APIs you build or consume.
It has a beautiful native macOS interface to compose requests, inspect server responses, generate client code and export API definitions.</div><div class="kg-bookmark-metadata"><img class="kg-bookmark-icon" src="https://cdn-static.paw.cloud/img/favicons/android-chrome-192x192-55f3396071.png" alt="A few awesome useful and nice to have tools"><span class="kg-bookmark-author">Paw &#x2013; The most advanced API tool for Mac</span><span class="kg-bookmark-publisher">Paw Inc.</span></div></div><div class="kg-bookmark-thumbnail"><img src="https://cdn-static.paw.cloud/img/og/paw-home-twitter-card-text-296e1c259c.png" alt="A few awesome useful and nice to have tools"></div></a></figure><h2 id="visual-studio-code-vs-code-"><a href="https://code.visualstudio.com/?ref=weekly.elfitz.com">Visual Studio Code</a> (VS Code)</h2><figure class="kg-card kg-bookmark-card"><a class="kg-bookmark-container" href="https://code.visualstudio.com/?ref=weekly.elfitz.com"><div class="kg-bookmark-content"><div class="kg-bookmark-title">Visual Studio Code - Code Editing. Redefined</div><div class="kg-bookmark-description">Visual Studio Code is a code editor redefined and optimized for building and debugging modern web and cloud applications. 
Visual Studio Code is free and available on your favorite platform - Linux, macOS, and Windows.</div><div class="kg-bookmark-metadata"><img class="kg-bookmark-icon" src="https://code.visualstudio.com/favicon.ico" alt="A few awesome useful and nice to have tools"><span class="kg-bookmark-author">Visual Studio Code</span></div></div><div class="kg-bookmark-thumbnail"><img src="https://code.visualstudio.com/opengraphimg/opengraph-home.png" alt="A few awesome useful and nice to have tools"></div></a></figure><h2 id="serverless-framework"><a href="https://www.serverless.com/?ref=weekly.elfitz.com">Serverless Framework</a></h2><figure class="kg-card kg-bookmark-card"><a class="kg-bookmark-container" href="https://serverless.com/?ref=weekly.elfitz.com"><div class="kg-bookmark-content"><div class="kg-bookmark-title">Serverless - The Serverless Application Framework powered by AWS Lambda, API Gateway, and more</div><div class="kg-bookmark-description">Build web, mobile and IoT applications using AWS Lambda and API Gateway, Azure Functions, Google Cloud Functions, and more.</div><div class="kg-bookmark-metadata"><img class="kg-bookmark-icon" src="https://serverless.com/favicons/apple-touch-icon-180x180.png" alt="A few awesome useful and nice to have tools"><span class="kg-bookmark-author">serverless</span></div></div><div class="kg-bookmark-thumbnail"><img src="https://s3-us-west-2.amazonaws.com/assets.site.serverless.com/logos/Serverless_mark_black_400x400_v3%402x.jpg" alt="A few awesome useful and nice to have tools"></div></a></figure><h2 id="aws-cdk"><a href="https://aws.amazon.com/cdk/?ref=weekly.elfitz.com">AWS CDK</a></h2><figure class="kg-card kg-bookmark-card"><a class="kg-bookmark-container" href="https://aws.amazon.com/fr/cdk/?ref=weekly.elfitz.com"><div class="kg-bookmark-content"><div class="kg-bookmark-title">AWS Cloud Development Kit &#x2013; Amazon Web Services</div><div class="kg-bookmark-description">The AWS Cloud Development Kit (CDK) is a software development framework for modeling and provisioning resources for your cloud applications using familiar programming languages.</div><div class="kg-bookmark-metadata"><img class="kg-bookmark-icon" src="https://a0.awsstatic.com/libra-css/images/site/touch-icon-ipad-144-smile.png" alt="A few awesome useful and nice to have tools"><span class="kg-bookmark-author">Amazon Web Services, Inc.</span></div></div><div class="kg-bookmark-thumbnail"><img src="https://a0.awsstatic.com/libra-css/images/logos/aws_logo_smile_1200x630.png" alt="A few awesome useful and nice to have tools"></div></a></figure><h2 id="veeer"><a href="http://veeer.io/?ref=weekly.elfitz.com">VEEER</a></h2><p>A neat window manager that allows you to simply and quickly move your windows around using keyboard shortcuts.</p><figure class="kg-card kg-bookmark-card"><a class="kg-bookmark-container" href="https://veeer.io/?ref=weekly.elfitz.com"><div class="kg-bookmark-content"><div class="kg-bookmark-title">VEEER Window Manager for Mac OS</div><div class="kg-bookmark-description">VEEER was built with maximizing your workflow speed in mind - which makes it an essential designer &amp; developer productivity tool.</div><div class="kg-bookmark-metadata"><img class="kg-bookmark-icon" src="https://veeer.io/favicon.ico" alt="A few awesome useful and nice to have tools"><span class="kg-bookmark-author">VEEER</span></div></div><div class="kg-bookmark-thumbnail"><img src="http://veeer.io/wp-content/uploads/2018/02/facebook_preview.png" alt="A few awesome useful and nice to have tools"></div></a></figure><h2 id="hemingway-editor"><a href="http://www.hemingwayapp.com/?ref=weekly.elfitz.com">Hemingway Editor</a></h2><p>An overpriced writing guide that doesn&apos;t seem to have been updated in quite some time.
Still, it&apos;s a nice tool to help improve your writing, when you care to.</p><figure class="kg-card kg-bookmark-card"><a class="kg-bookmark-container" href="https://hemingwayapp.com/?ref=weekly.elfitz.com"><div class="kg-bookmark-content"><div class="kg-bookmark-title">Hemingway Editor</div><div class="kg-bookmark-description"></div><div class="kg-bookmark-metadata"><img class="kg-bookmark-icon" src="https://hemingwayapp.com/img/favicon/mstile-310x310.png" alt="A few awesome useful and nice to have tools"><span class="kg-bookmark-author">Hemingway Editor</span></div></div><div class="kg-bookmark-thumbnail"><img src="https://hemingwayapp.com/img/just-released.png" alt="A few awesome useful and nice to have tools"></div></a></figure><h2 id="effortless"><a href="https://www.effortless.app/?ref=weekly.elfitz.com">Effortless</a></h2><p>&quot;Stay focused&quot;. It will probably take me a lifetime to learn how to do that. Still, it&apos;s one of my favorite task managers, probably because it lacks so many features that I can&apos;t get distracted trying to procrastinate by &quot;learning how to best use this awesome tool that will certainly 10x my productivity&quot;.
I&apos;ve just described <a href="https://chat.openai.com/chat?ref=weekly.elfitz.com">ChatGPT</a>, haven&apos;t I?</p><figure class="kg-card kg-bookmark-card"><a class="kg-bookmark-container" href="https://www.effortless.app/?ref=weekly.elfitz.com"><div class="kg-bookmark-content"><div class="kg-bookmark-title">Effortless</div><div class="kg-bookmark-description">Stay Focused.</div><div class="kg-bookmark-metadata"><img class="kg-bookmark-icon" src="https://www.effortless.app/favicon.ico" alt="A few awesome useful and nice to have tools"></div></div></a></figure><h2 id="air-buddy"><a href="https://v2.airbuddy.app/?ref=weekly.elfitz.com">Air Buddy</a></h2><figure class="kg-card kg-bookmark-card"><a class="kg-bookmark-container" href="https://gumroad.com/l/airbuddy?ref=weekly.elfitz.com"><div class="kg-bookmark-content"><div class="kg-bookmark-title">AirBuddy</div><div class="kg-bookmark-description">Introducing AirBuddy. AirBuddy brings the same AirPods experience you have on iOS to the Mac. With AirBuddy, you can open up your AirPods case next to your Mac and see the status right away, just like it is on your iPhone or iPad. 
A simple click and you&#x2019;re connected and playing your Mac&#x2019;s audio to A&#x2026;</div><div class="kg-bookmark-metadata"><img class="kg-bookmark-icon" src="https://gumroad.com/favicon.ico" alt="A few awesome useful and nice to have tools"><span class="kg-bookmark-author">Gumroad</span></div></div><div class="kg-bookmark-thumbnail"><img src="https://static-2.gumroad.com/res/gumroad/6511982560514/asset_previews/6e3eb62410d674979db11d0ed16c2dad/retina/Header.png" alt="A few awesome useful and nice to have tools"></div></a></figure><h2 id="nsdateformatter-com"><a href="https://nsdateformatter.com/?ref=weekly.elfitz.com">NSDateFormatter.com</a></h2><figure class="kg-card kg-bookmark-card"><a class="kg-bookmark-container" href="https://nsdateformatter.com/?ref=weekly.elfitz.com"><div class="kg-bookmark-content"><div class="kg-bookmark-title">NSDateFormatter.com - Live Date Formatting Playground for Swift</div><div class="kg-bookmark-description">An interactive playground and reference for formatting and parsing dates with DateFormatter using Swift or Objective-C.</div><div class="kg-bookmark-metadata"><span class="kg-bookmark-author">Live Date Formatting Playground for Swift</span></div></div><div class="kg-bookmark-thumbnail"><img src="https://nsdateformatter.com/images/swift-logo.png" alt="A few awesome useful and nice to have tools"></div></a></figure><h2 id="setapp"><a href="https://setapp.com/?ref=weekly.elfitz.com">SetApp</a></h2><p>I didn&apos;t get it at first, but now it&apos;s just an obvious choice.</p><h2 id="aerial"><a href="https://github.com/JohnCoates/Aerial?ref=weekly.elfitz.com">Aerial</a></h2><p>If you still work at the office and want colleagues to stare at your monitor while you&apos;re away each time they come by on their way to the coffee machine, this is for you!
Personally, I just love seeing these beautiful scenes and making plans to see them in person, with my own two eyes.</p><figure class="kg-card kg-bookmark-card"><a class="kg-bookmark-container" href="https://github.com/JohnCoates/Aerial?ref=weekly.elfitz.com"><div class="kg-bookmark-content"><div class="kg-bookmark-title">JohnCoates/Aerial</div><div class="kg-bookmark-description">Apple TV Aerial Screensaver for Mac. Contribute to JohnCoates/Aerial development by creating an account on GitHub.</div><div class="kg-bookmark-metadata"><img class="kg-bookmark-icon" src="https://github.githubassets.com/favicon.ico" alt="A few awesome useful and nice to have tools"><span class="kg-bookmark-author">GitHub</span><span class="kg-bookmark-publisher">JohnCoates</span></div></div><div class="kg-bookmark-thumbnail"><img src="https://avatars1.githubusercontent.com/u/967800?s=400&amp;v=4" alt="A few awesome useful and nice to have tools"></div></a></figure>]]></content:encoded></item><item><title><![CDATA[Preserve your images' transparency when saving and sharing them, in Swift]]></title><description><![CDATA[Losing your rounded corners or transparent background when saving an image to the user's photo library or sharing it with UIActivityViewController? It's because it's been turned into a JPEG.
Here&apos;s how to solve that.]]></description><link>https://weekly.elfitz.com/2022/12/05/save-share-your-png-images-in-swift/</link><guid isPermaLink="false">631afb67e3eda2003dcbc405</guid><category><![CDATA[Swift Programming]]></category><category><![CDATA[iOS Development]]></category><category><![CDATA[Mobile Development]]></category><category><![CDATA[Programming]]></category><dc:creator><![CDATA[ElFitz]]></dc:creator><pubDate>Mon, 05 Dec 2022 06:16:45 GMT</pubDate><media:content url="https://images.unsplash.com/photo-1512099053734-e6767b535838?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=MnwxMTc3M3wwfDF8c2VhcmNofDN8fHRyYW5zcGFyZW5jeXxlbnwwfHx8fDE2NjI3MTI2OTA&amp;ixlib=rb-1.2.1&amp;q=80&amp;w=2000" medium="image"/><content:encoded><![CDATA[<img src="https://images.unsplash.com/photo-1512099053734-e6767b535838?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=MnwxMTc3M3wwfDF8c2VhcmNofDN8fHRyYW5zcGFyZW5jeXxlbnwwfHx8fDE2NjI3MTI2OTA&amp;ixlib=rb-1.2.1&amp;q=80&amp;w=2000" alt="Preserve your images&apos; transparency when saving and sharing them, in Swift"><p>Let&apos;s say we are making an app. A chat app, a social media app where the main medium is pictures, an image generation app using <a href="https://weekly.elfitz.com/2022/09/02/everybodys-doing-it-a-stable-diffusion-test-post/">the latest image generation models machine learning has to offer</a>, whatever. It&apos;s an iOS app, written in Swift, it shows images, and users can save those locally and re-share them through other apps, without losing the alpha (the transparency).<br><br>How do we preserve said alpha? We save those images as PNG images instead of JPEG.</p><p>How do we save a UIImage as a PNG image in Swift? Easy:</p><pre><code class="language-Swift">func pngImage(image: UIImage) -&gt; UIImage? {
    // Re-create the image from its PNG representation to keep the alpha channel
    guard let data = image.pngData(),
          let pngImage = UIImage(data: data) else {
        return nil
    }
    return pngImage
}

func savePngImage(image: UIImage) {
    // pngImage(image:) takes a labeled argument
    guard let pngImage = pngImage(image: image) else {
        return
    }
    UIImageWriteToSavedPhotosAlbum(pngImage, nil, nil, nil)
}</code></pre><p>Now, how do we share a PNG image in Swift? Even easier:</p><pre><code class="language-Swift">func sharePngImage(image: UIImage) {
    // pngData() returns Data?, so unwrap it rather than sharing a nil item
    guard let pngData = image.pngData() else { return }
    let activityItems: [Any] = [pngData]
    let activityVC = UIActivityViewController(activityItems: activityItems, applicationActivities: nil)
    UIApplication.shared.windows.first?.rootViewController?.present(activityVC, animated: true, completion: nil)
}</code></pre><p>Yes, that&apos;s it. No long story, no contrived explanations. Just what you (and future me) need! I hope it helped, and wish you all a great day!</p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://weekly.elfitz.com/content/images/2022/12/pina-colada-mj.jpg" class="kg-image" alt="Preserve your images&apos; transparency when saving and sharing them, in Swift" loading="lazy" width="1536" height="1024" srcset="https://weekly.elfitz.com/content/images/size/w600/2022/12/pina-colada-mj.jpg 600w, https://weekly.elfitz.com/content/images/size/w1000/2022/12/pina-colada-mj.jpg 1000w, https://weekly.elfitz.com/content/images/2022/12/pina-colada-mj.jpg 1536w" sizes="(min-width: 720px) 720px"><figcaption>A pina colada by the beach, as imagined by a drunken AI. In jpg, because eating up 2MB of bandwidth for that would be plain rude.&#xA0;</figcaption></figure>]]></content:encoded></item><item><title><![CDATA[Everybody's doing it (a Stable Diffusion test post)]]></title><description><![CDATA[<p>A few years back, in February 2019, I, like quite a handful of people, heard for the first time of <a href="https://openai.com/blog/better-language-models/?ref=weekly.elfitz.com">OpenAI&apos;s GPT-2</a>. And while it sure was impressive, it only gave us a glimpse of what GPT-3 would be able to do a year later. 
Impressive wouldn&apos;</p>]]></description><link>https://weekly.elfitz.com/2022/09/02/everybodys-doing-it-a-stable-diffusion-test-post/</link><guid isPermaLink="false">63124ec7fd9465003d15d1ca</guid><category><![CDATA[Stable Diffusion]]></category><category><![CDATA[Machine Learning]]></category><category><![CDATA[AI]]></category><category><![CDATA[Deep Learning]]></category><category><![CDATA[AWS]]></category><category><![CDATA[Tailscale]]></category><category><![CDATA[Thoughts]]></category><dc:creator><![CDATA[ElFitz]]></dc:creator><pubDate>Fri, 02 Sep 2022 19:26:39 GMT</pubDate><media:content url="https://weekly.elfitz.com/content/images/2022/09/00003-50_DDIM_356472175-gfpgan-esrgan4x.png" medium="image"/><content:encoded><![CDATA[<img src="https://weekly.elfitz.com/content/images/2022/09/00003-50_DDIM_356472175-gfpgan-esrgan4x.png" alt="Everybody&apos;s doing it (a Stable Diffusion test post)"><p>A few years back, in February 2019, I, like quite a handful of people, heard for the first time of <a href="https://openai.com/blog/better-language-models/?ref=weekly.elfitz.com">OpenAI&apos;s GPT-2</a>. And while it sure was impressive, it only gave us a glimpse of what GPT-3 would be able to do a year later. 
Impressive wouldn&apos;t start to describe what some have accomplished with it.</p><p>But that feeling of awe turned out to be nothing compared to seeing what <a href="https://openai.com/dall-e-2/?ref=weekly.elfitz.com">DALL&#xB7;E 2</a>, <a href="https://imagen.research.google/?ref=weekly.elfitz.com">Google&apos;s Imagen</a> and <a href="https://www.midjourney.com/home/?ref=weekly.elfitz.com">Midjourney</a> are capable of.</p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://weekly.elfitz.com/content/images/2022/09/a-photo-of-a-raccoon-wearing-an-astronaut-helmet.jpg" class="kg-image" alt="Everybody&apos;s doing it (a Stable Diffusion test post)" loading="lazy" width="1024" height="1024" srcset="https://weekly.elfitz.com/content/images/size/w600/2022/09/a-photo-of-a-raccoon-wearing-an-astronaut-helmet.jpg 600w, https://weekly.elfitz.com/content/images/size/w1000/2022/09/a-photo-of-a-raccoon-wearing-an-astronaut-helmet.jpg 1000w, https://weekly.elfitz.com/content/images/2022/09/a-photo-of-a-raccoon-wearing-an-astronaut-helmet.jpg 1024w" sizes="(min-width: 720px) 720px"><figcaption>My personal favorite: <em>&quot;A photo of a raccoon wearing an astronaut helmet, looking out of the window at night&quot;</em> - &#xA9;Google Imagen, I guess? 
- From the <a href="https://imagen.research.google/?ref=weekly.elfitz.com">Imagen website</a></figcaption></figure><p>But I&apos;m still waiting on my DALL&#xB7;E 2 beta access, Imagen is as tightly closed as it could be, and there seems to be no plan to release Midjourney or a public API.</p><p>Now, saying <a href="https://github.com/CompVis/stable-diffusion?ref=weekly.elfitz.com">Stable Diffusion</a> is only great because it&apos;s <em>freely </em>available <em>now</em> would be both a reductive and a simplistic statement.</p><h1 id="stable-diffusion">Stable Diffusion</h1><p><a href="https://stability.ai/?ref=weekly.elfitz.com">stability.ai</a>&apos;s Stable Diffusion is <em>great </em>because <em>it is a great model</em> and because it is <em>a completely, publicly &amp; freely available</em> one. Meaning pretty much anyone can use it. And anyone can <em>build</em> anything they want with it (within the confines of its license &amp; the law, obviously).</p><p>In my case, I simply set out to install <a href="https://github.com/hlky/stable-diffusion?ref=weekly.elfitz.com">an open source Web UI someone made</a> for it on an AWS EC2 GPU Instance, using <a href="https://github.com/AbdBarho/stable-diffusion-webui-docker?ref=weekly.elfitz.com">Docker</a>, and securing my and my friends&apos; access to it all using Tailscale.</p><figure class="kg-card kg-gallery-card kg-width-wide kg-card-hascaption"><div class="kg-gallery-container"><div class="kg-gallery-row"><div class="kg-gallery-image"><img src="https://weekly.elfitz.com/content/images/2022/09/00003-50_DDIM_356472175-gfpgan-esrgan4x-1.png" width="2000" height="2000" loading="lazy" alt="Everybody&apos;s doing it (a Stable Diffusion test post)" srcset="https://weekly.elfitz.com/content/images/size/w600/2022/09/00003-50_DDIM_356472175-gfpgan-esrgan4x-1.png 600w, https://weekly.elfitz.com/content/images/size/w1000/2022/09/00003-50_DDIM_356472175-gfpgan-esrgan4x-1.png 1000w,
https://weekly.elfitz.com/content/images/size/w1600/2022/09/00003-50_DDIM_356472175-gfpgan-esrgan4x-1.png 1600w, https://weekly.elfitz.com/content/images/2022/09/00003-50_DDIM_356472175-gfpgan-esrgan4x-1.png 2048w" sizes="(min-width: 720px) 720px"></div><div class="kg-gallery-image"><img src="https://weekly.elfitz.com/content/images/2022/09/00002-50_DDIM_356472175-gfpgan-esrgan4x.png" width="2000" height="2000" loading="lazy" alt="Everybody&apos;s doing it (a Stable Diffusion test post)" srcset="https://weekly.elfitz.com/content/images/size/w600/2022/09/00002-50_DDIM_356472175-gfpgan-esrgan4x.png 600w, https://weekly.elfitz.com/content/images/size/w1000/2022/09/00002-50_DDIM_356472175-gfpgan-esrgan4x.png 1000w, https://weekly.elfitz.com/content/images/size/w1600/2022/09/00002-50_DDIM_356472175-gfpgan-esrgan4x.png 1600w, https://weekly.elfitz.com/content/images/2022/09/00002-50_DDIM_356472175-gfpgan-esrgan4x.png 2048w" sizes="(min-width: 720px) 720px"></div></div><div class="kg-gallery-row"><div class="kg-gallery-image"><img src="https://weekly.elfitz.com/content/images/2022/09/00001-50_DDIM_356472175-gfpgan-esrgan4x.png" width="2000" height="2000" loading="lazy" alt="Everybody&apos;s doing it (a Stable Diffusion test post)" srcset="https://weekly.elfitz.com/content/images/size/w600/2022/09/00001-50_DDIM_356472175-gfpgan-esrgan4x.png 600w, https://weekly.elfitz.com/content/images/size/w1000/2022/09/00001-50_DDIM_356472175-gfpgan-esrgan4x.png 1000w, https://weekly.elfitz.com/content/images/size/w1600/2022/09/00001-50_DDIM_356472175-gfpgan-esrgan4x.png 1600w, https://weekly.elfitz.com/content/images/2022/09/00001-50_DDIM_356472175-gfpgan-esrgan4x.png 2048w" sizes="(min-width: 720px) 720px"></div><div class="kg-gallery-image"><img src="https://weekly.elfitz.com/content/images/2022/09/00000-50_DDIM_356472175-gfpgan-esrgan4x.png" width="2000" height="2000" loading="lazy" alt="Everybody&apos;s doing it (a Stable Diffusion test post)" 
srcset="https://weekly.elfitz.com/content/images/size/w600/2022/09/00000-50_DDIM_356472175-gfpgan-esrgan4x.png 600w, https://weekly.elfitz.com/content/images/size/w1000/2022/09/00000-50_DDIM_356472175-gfpgan-esrgan4x.png 1000w, https://weekly.elfitz.com/content/images/size/w1600/2022/09/00000-50_DDIM_356472175-gfpgan-esrgan4x.png 1600w, https://weekly.elfitz.com/content/images/2022/09/00000-50_DDIM_356472175-gfpgan-esrgan4x.png 2048w" sizes="(min-width: 720px) 720px"></div></div></div><figcaption><em>a monkey and a computer in space, intricate, cinematic lighting, highly detailed, digital painting, artstation, concept art, smooth, sharp focus, illustration, illustrated by Sophie Anderson, Mark Arian|steampunk|cyberpunk</em> - or the author of this post</figcaption></figure><p>There are already <a href="https://beta.dreamstudio.ai/?ref=weekly.elfitz.com">several SaaS</a>, <a href="https://creator.nightcafe.studio/create/text-to-image?algo=stable&amp;ref=weekly.elfitz.com">web apps</a> &amp; <a href="https://replicate.com/stability-ai/stable-diffusion?ref=weekly.elfitz.com">other online services</a> making it available to the masses.</p><p><a href="https://github.com/AUTOMATIC1111/stable-diffusion-webui?ref=weekly.elfitz.com">Some, on the other hand</a>, have made several <a href="https://github.com/hlky/stable-diffusion?ref=weekly.elfitz.com">Web UIs</a> (with quite interesting features such as prompt matrices &amp; loopbacks), combining [Stable Diffusion] with other upscaling &amp; facial features fixing models.</p><p><a href="https://github.com/AbdBarho/stable-diffusion-webui-docker?ref=weekly.elfitz.com">Others have containerized those</a>.</p><p>And <a href="https://replicate.com/deforum/deforum_stable_diffusion?ref=weekly.elfitz.com">others yet</a> have already managed to combine [Stable Diffusion]&#xA0;with other models <a href="https://replicate.com/andreasjansson/stable-diffusion-animation?ref=weekly.elfitz.com">to generate truly mesmerizing 
animations</a>.</p><figure class="kg-card kg-gallery-card kg-width-wide kg-card-hascaption"><div class="kg-gallery-container"><div class="kg-gallery-row"><div class="kg-gallery-image"><img src="https://weekly.elfitz.com/content/images/2022/09/IMG_3193.PNG" width="512" height="512" loading="lazy" alt="Everybody&apos;s doing it (a Stable Diffusion test post)"></div><div class="kg-gallery-image"><img src="https://weekly.elfitz.com/content/images/2022/09/IMG_3192-2.PNG" width="512" height="512" loading="lazy" alt="Everybody&apos;s doing it (a Stable Diffusion test post)"></div><div class="kg-gallery-image"><img src="https://weekly.elfitz.com/content/images/2022/09/IMG_3189.PNG" width="1080" height="1078" loading="lazy" alt="Everybody&apos;s doing it (a Stable Diffusion test post)" srcset="https://weekly.elfitz.com/content/images/size/w600/2022/09/IMG_3189.PNG 600w, https://weekly.elfitz.com/content/images/size/w1000/2022/09/IMG_3189.PNG 1000w, https://weekly.elfitz.com/content/images/2022/09/IMG_3189.PNG 1080w" sizes="(min-width: 720px) 720px"></div></div></div><figcaption>An oil painting of a poker playing octopus, a mouse-cat &amp; an alien chair. Guess which one does not exist?</figcaption></figure><p>So if you have a curious mind, like fooling around with wondrous new toys and figuring out what they can enable you to accomplish, now is the time to join the fray.
To have fun generating truly absurd things, figure out this brand new and fascinating technology&apos;s limits, and see what we can build with it.</p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://weekly.elfitz.com/content/images/2022/09/IMG_3199.PNG" class="kg-image" alt="Everybody&apos;s doing it (a Stable Diffusion test post)" loading="lazy" width="512" height="512"><figcaption>Did someone ask for a commie octopus?</figcaption></figure><p></p>]]></content:encoded></item><item><title><![CDATA[From a base64 API Key to a json file, in bash, on macOS]]></title><description><![CDATA[While working on a client's app's CI pipeline, I needed to use an API key. However, it was stored as a base64 environment variable, when the tool I needed to use required it to be plaintext in a JSON file. Fun times.]]></description><link>https://weekly.elfitz.com/2022/06/28/base64-to-json-in-bash-on-macos/</link><guid isPermaLink="false">62ab69df9e55f1003ddf5a19</guid><category><![CDATA[shell]]></category><category><![CDATA[macOS]]></category><category><![CDATA[Programming]]></category><dc:creator><![CDATA[ElFitz]]></dc:creator><pubDate>Tue, 28 Jun 2022 15:00:42 GMT</pubDate><media:content url="https://images.unsplash.com/photo-1488229297570-58520851e868?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=MnwxMTc3M3wwfDF8c2VhcmNofDE5fHxjb2RlfGVufDB8fHx8MTY1NTQwMDk0Ng&amp;ixlib=rb-1.2.1&amp;q=80&amp;w=2000" medium="image"/><content:encoded><![CDATA[<h1 id="context">Context</h1><img src="https://images.unsplash.com/photo-1488229297570-58520851e868?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=MnwxMTc3M3wwfDF8c2VhcmNofDE5fHxjb2RlfGVufDB8fHx8MTY1NTQwMDk0Ng&amp;ixlib=rb-1.2.1&amp;q=80&amp;w=2000" alt="From a base64 API Key to a json file, in bash, on macOS"><p>While working on a client&apos;s app&apos;s CI pipeline, I needed to use an API key. 
It was already used by some of the pipeline&apos;s other stages, and thus available to me as an environment variable. However, it was encoded in base64, when the tool I needed to use required it to be plaintext in a JSON file.</p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://images.unsplash.com/photo-1496843916299-590492c751f4?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=MnwxMTc3M3wwfDF8c2VhcmNofDMxfHxmdW58ZW58MHx8fHwxNjU1NDExNzA0&amp;ixlib=rb-1.2.1&amp;q=80&amp;w=2000" class="kg-image" alt="From a base64 API Key to a json file, in bash, on macOS" loading="lazy" width="5777" height="3851" srcset="https://images.unsplash.com/photo-1496843916299-590492c751f4?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=MnwxMTc3M3wwfDF8c2VhcmNofDMxfHxmdW58ZW58MHx8fHwxNjU1NDExNzA0&amp;ixlib=rb-1.2.1&amp;q=80&amp;w=600 600w, https://images.unsplash.com/photo-1496843916299-590492c751f4?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=MnwxMTc3M3wwfDF8c2VhcmNofDMxfHxmdW58ZW58MHx8fHwxNjU1NDExNzA0&amp;ixlib=rb-1.2.1&amp;q=80&amp;w=1000 1000w, https://images.unsplash.com/photo-1496843916299-590492c751f4?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=MnwxMTc3M3wwfDF8c2VhcmNofDMxfHxmdW58ZW58MHx8fHwxNjU1NDExNzA0&amp;ixlib=rb-1.2.1&amp;q=80&amp;w=1600 1600w, https://images.unsplash.com/photo-1496843916299-590492c751f4?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=MnwxMTc3M3wwfDF8c2VhcmNofDMxfHxmdW58ZW58MHx8fHwxNjU1NDExNzA0&amp;ixlib=rb-1.2.1&amp;q=80&amp;w=2400 2400w" sizes="(min-width: 720px) 720px"><figcaption>Closer to my idea of a fun afternoon - Photo by <a href="https://unsplash.com/@pineapple?utm_source=ghost&amp;utm_medium=referral&amp;utm_campaign=api-credit">Pineapple Supply Co.</a> / <a href="https://unsplash.com/?utm_source=ghost&amp;utm_medium=referral&amp;utm_campaign=api-credit">Unsplash</a></figcaption></figure><h1 id="set-up">Set up</h1><p>Let&apos;s start with a simple string 
from our favorite text generator: <a href="https://slipsum.com/?ref=weekly.elfitz.com">Samuel L Ipsum</a>. Why? Because like Apple&apos;s <a href="https://appstoreconnect.apple.com/?ref=weekly.elfitz.com">Appstore Connect</a> <a href="https://developer.apple.com/documentation/appstoreconnectapi?ref=weekly.elfitz.com">API keys</a>, it contains line breaks.</p><blockquote>My money&apos;s in that office, right? If she start giving me some bullshit about it ain&apos;t there, and we got to go someplace else and get it, I&apos;m gonna shoot you in the head then and there. Then I&apos;m gonna shoot that bitch in the kneecaps, find out where my goddamn money is. She gonna tell me too. Hey, look at me when I&apos;m talking to you, motherfucker. You listen: we go in there, and that nigga Winston or anybody else is in there, you the first motherfucker to get shot. You understand?<br> <br>Normally, both your asses would be dead as fucking fried chicken, but you happen to pull this shit while I&apos;m in a transitional period so I don&apos;t wanna kill you, I wanna help you. But I can&apos;t give you this case, it don&apos;t belong to me. Besides, I&apos;ve already been through too much shit this morning over this case to hand it over to your dumb ass.<br> <br>Your bones don&apos;t break, mine do. That&apos;s clear. Your cells react to bacteria and viruses differently than mine. You don&apos;t get sick, I do. That&apos;s also clear. But for some reason, you and I react the exact same way to water. We swallow it too fast, we choke. We get some in our lungs, we drown. However unreal it may seem, we are connected, you and I. We&apos;re on the same curve, just on opposite ends.<br> <br>Look, just because I don&apos;t be givin&apos; no man a foot massage don&apos;t make it right for Marsellus to throw Antwone into a glass motherfuckin&apos; house, fuckin&apos; up the way the nigger talks. 
Motherfucker do that shit to me, he better paralyze my ass, &apos;cause I&apos;ll kill the motherfucker, know what I&apos;m sayin&apos;?<br> <br>My money&apos;s in that office, right? If she start giving me some bullshit about it ain&apos;t there, and we got to go someplace else and get it, I&apos;m gonna shoot you in the head then and there. Then I&apos;m gonna shoot that bitch in the kneecaps, find out where my goddamn money is. She gonna tell me too. Hey, look at me when I&apos;m talking to you, motherfucker. You listen: we go in there, and that nigga Winston or anybody else is in there, you the first motherfucker to get shot. You understand?</blockquote><pre><code>OUR_VAR=&quot;My money&apos;s in that office, right? If she start giving me some bullshit about it ain&apos;t there, and we got to go someplace else and get it, I&apos;m gonna shoot you in the head then and there. Then I&apos;m gonna shoot that bitch in the kneecaps, find out where my goddamn money is. She gonna tell me too. Hey, look at me when I&apos;m talking to you, motherfucker. You listen: we go in there, and that nigga Winston or anybody else is in there, you the first motherfucker to get shot. You understand?

Normally, both your asses would be dead as fucking fried chicken, but you happen to pull this shit while I&apos;m in a transitional period so I don&apos;t wanna kill you, I wanna help you. But I can&apos;t give you this case, it don&apos;t belong to me. Besides, I&apos;ve already been through too much shit this morning over this case to hand it over to your dumb ass.

Your bones don&apos;t break, mine do. That&apos;s clear. Your cells react to bacteria and viruses differently than mine. You don&apos;t get sick, I do. That&apos;s also clear. But for some reason, you and I react the exact same way to water. We swallow it too fast, we choke. We get some in our lungs, we drown. However unreal it may seem, we are connected, you and I. We&apos;re on the same curve, just on opposite ends.

Look, just because I don&apos;t be givin&apos; no man a foot massage don&apos;t make it right for Marsellus to throw Antwone into a glass motherfuckin&apos; house, fuckin&apos; up the way the nigger talks. Motherfucker do that shit to me, he better paralyze my ass, &apos;cause I&apos;ll kill the motherfucker, know what I&apos;m sayin&apos;?

My money&apos;s in that office, right? If she start giving me some bullshit about it ain&apos;t there, and we got to go someplace else and get it, I&apos;m gonna shoot you in the head then and there. Then I&apos;m gonna shoot that bitch in the kneecaps, find out where my goddamn money is. She gonna tell me too. Hey, look at me when I&apos;m talking to you, motherfucker. You listen: we go in there, and that nigga Winston or anybody else is in there, you the first motherfucker to get shot. You understand?&quot;

echo &quot;$OUR_VAR&quot;</code></pre><p>Encoding it in <code>base64</code> and storing it in a variable is easy enough (note the quotes around <code>$OUR_VAR</code>, which preserve its line breaks):</p><pre><code class="language-bash">BASE64_API_KEY=$(echo &quot;$OUR_VAR&quot; | base64)</code></pre><h1 id="decoding-a-base64-string">Decoding a <code>base64</code> string</h1><p>Now, let&apos;s decode it and store the decoded string, shall we?</p><pre><code class="language-bash">DECODED_API_KEY=$(echo &quot;$BASE64_API_KEY&quot; | base64 --decode)
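# An aside: echo appends a trailing newline of its own, which gets encoded
# along with the string. When an exact round-trip matters, printf avoids that:
#   BASE64_API_KEY=$(printf &apos;%s&apos; &quot;$OUR_VAR&quot; | base64)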
echo &quot;$DECODED_API_KEY&quot;</code></pre><p>Still quite simple. </p><h1 id="escaping-newlines-line-breaks">Escaping newlines &amp; line breaks</h1><p>Now, since we want to store that key in a <code>.json</code> file, we need to escape our newlines. Easy, right? Not on macOS. Thankfully, there&apos;s a known solution (shameless plug alert): <a href="https://weekly.elfitz.com/2022/06/16/escape-a-newline-in-bash-on-macos/">Escape a newline in bash on macOS</a>.</p><figure class="kg-card kg-code-card"><pre><code>NEWLINES_ESCAPED_API_KEY=$(echo &quot;$DECODED_API_KEY&quot; | awk -v ORS=&apos;\\n&apos; &apos;1&apos;)
printf &quot;\n[NEWLINE_ESCAPE_API_KEY]\n%s\n--------\n&quot; &quot;$NEWLINES_ESCAPED_API_KEY&quot;</code></pre><figcaption>Why printf and not echo? Because we want to actually see our \n characters</figcaption></figure><h1 id="creating-a-new-json-file-with-the-decoded-content">Creating a new <code>JSON</code> file with the decoded content</h1><p>And now, instead of playing around with echo and manually escaping newlines and double quotes, let&apos;s simply use <code>cat</code> with a here-document (using <code>EOF</code> as the delimiter):</p><pre><code class="language-bash">cat &lt;&lt;EOF &gt; &quot;file.json&quot;
{
  &quot;key&quot;: &quot;$NEWLINES_ESCAPED_API_KEY&quot;
}
EOF</code></pre><h1 id="full-example">Full example</h1><p>Here&apos;s what a full script would look like:</p><pre><code class="language-bash">JSON_FILE_PATH=&quot;file.json&quot;

ACCOUNT_ID=&quot;3dfae1f9-a8a7-4847-a8c0-306b1d2a1016&quot;

BASE64_API_KEY=&quot;TXkgbW9uZXkncyBpbiB0aGF0IG9mZmljZSwgcmlnaHQ/IElmIHNoZSBzdGFydCBnaXZpbmcgbWUgc29tZSBidWxsc2hpdCBhYm91dCBpdCBhaW4ndCB0aGVyZSwgYW5kIHdlIGdvdCB0byBnbyBzb21lcGxhY2UgZWxzZSBhbmQgZ2V0IGl0LCBJJ20gZ29ubmEgc2hvb3QgeW91IGluIHRoZSBoZWFkIHRoZW4gYW5kIHRoZXJlLiBUaGVuIEknbSBnb25uYSBzaG9vdCB0aGF0IGJpdGNoIGluIHRoZSBrbmVlY2FwcywgZmluZCBvdXQgd2hlcmUgbXkgZ29kZGFtbiBtb25leSBpcy4gU2hlIGdvbm5hIHRlbGwgbWUgdG9vLiBIZXksIGxvb2sgYXQgbWUgd2hlbiBJJ20gdGFsa2luZyB0byB5b3UsIG1vdGhlcmZ1Y2tlci4gWW91IGxpc3Rlbjogd2UgZ28gaW4gdGhlcmUsIGFuZCB0aGF0IG5pZ2dhIFdpbnN0b24gb3IgYW55Ym9keSBlbHNlIGlzIGluIHRoZXJlLCB5b3UgdGhlIGZpcnN0IG1vdGhlcmZ1Y2tlciB0byBnZXQgc2hvdC4gWW91IHVuZGVyc3RhbmQ/CgpOb3JtYWxseSwgYm90aCB5b3VyIGFzc2VzIHdvdWxkIGJlIGRlYWQgYXMgZnVja2luZyBmcmllZCBjaGlja2VuLCBidXQgeW91IGhhcHBlbiB0byBwdWxsIHRoaXMgc2hpdCB3aGlsZSBJJ20gaW4gYSB0cmFuc2l0aW9uYWwgcGVyaW9kIHNvIEkgZG9uJ3Qgd2FubmEga2lsbCB5b3UsIEkgd2FubmEgaGVscCB5b3UuIEJ1dCBJIGNhbid0IGdpdmUgeW91IHRoaXMgY2FzZSwgaXQgZG9uJ3QgYmVsb25nIHRvIG1lLiBCZXNpZGVzLCBJJ3ZlIGFscmVhZHkgYmVlbiB0aHJvdWdoIHRvbyBtdWNoIHNoaXQgdGhpcyBtb3JuaW5nIG92ZXIgdGhpcyBjYXNlIHRvIGhhbmQgaXQgb3ZlciB0byB5b3VyIGR1bWIgYXNzLgoKWW91ciBib25lcyBkb24ndCBicmVhaywgbWluZSBkby4gVGhhdCdzIGNsZWFyLiBZb3VyIGNlbGxzIHJlYWN0IHRvIGJhY3RlcmlhIGFuZCB2aXJ1c2VzIGRpZmZlcmVudGx5IHRoYW4gbWluZS4gWW91IGRvbid0IGdldCBzaWNrLCBJIGRvLiBUaGF0J3MgYWxzbyBjbGVhci4gQnV0IGZvciBzb21lIHJlYXNvbiwgeW91IGFuZCBJIHJlYWN0IHRoZSBleGFjdCBzYW1lIHdheSB0byB3YXRlci4gV2Ugc3dhbGxvdyBpdCB0b28gZmFzdCwgd2UgY2hva2UuIFdlIGdldCBzb21lIGluIG91ciBsdW5ncywgd2UgZHJvd24uIEhvd2V2ZXIgdW5yZWFsIGl0IG1heSBzZWVtLCB3ZSBhcmUgY29ubmVjdGVkLCB5b3UgYW5kIEkuIFdlJ3JlIG9uIHRoZSBzYW1lIGN1cnZlLCBqdXN0IG9uIG9wcG9zaXRlIGVuZHMuCgpMb29rLCBqdXN0IGJlY2F1c2UgSSBkb24ndCBiZSBnaXZpbicgbm8gbWFuIGEgZm9vdCBtYXNzYWdlIGRvbid0IG1ha2UgaXQgcmlnaHQgZm9yIE1hcnNlbGx1cyB0byB0aHJvdyBBbnR3b25lIGludG8gYSBnbGFzcyBtb3RoZXJmdWNraW4nIGhvdXNlLCBmdWNraW4nIHVwIHRoZSB3YXkgdGhlIG5pZ2dlciB0YWxrcy4gTW90aGVyZnVja2VyIGRvIHRoYXQgc2hpdCB0byBtZSwgaGUgYmV0dGVyIHBhcmFseXplIG15IGFzcywgJ2NhdXNlIEknbGwga2l
sbCB0aGUgbW90aGVyZnVja2VyLCBrbm93IHdoYXQgSSdtIHNheWluJz8KCk15IG1vbmV5J3MgaW4gdGhhdCBvZmZpY2UsIHJpZ2h0PyBJZiBzaGUgc3RhcnQgZ2l2aW5nIG1lIHNvbWUgYnVsbHNoaXQgYWJvdXQgaXQgYWluJ3QgdGhlcmUsIGFuZCB3ZSBnb3QgdG8gZ28gc29tZXBsYWNlIGVsc2UgYW5kIGdldCBpdCwgSSdtIGdvbm5hIHNob290IHlvdSBpbiB0aGUgaGVhZCB0aGVuIGFuZCB0aGVyZS4gVGhlbiBJJ20gZ29ubmEgc2hvb3QgdGhhdCBiaXRjaCBpbiB0aGUga25lZWNhcHMsIGZpbmQgb3V0IHdoZXJlIG15IGdvZGRhbW4gbW9uZXkgaXMuIFNoZSBnb25uYSB0ZWxsIG1lIHRvby4gSGV5LCBsb29rIGF0IG1lIHdoZW4gSSdtIHRhbGtpbmcgdG8geW91LCBtb3RoZXJmdWNrZXIuIFlvdSBsaXN0ZW46IHdlIGdvIGluIHRoZXJlLCBhbmQgdGhhdCBuaWdnYSBXaW5zdG9uIG9yIGFueWJvZHkgZWxzZSBpcyBpbiB0aGVyZSwgeW91IHRoZSBmaXJzdCBtb3RoZXJmdWNrZXIgdG8gZ2V0IHNob3QuIFlvdSB1bmRlcnN0YW5kPw==&quot;
printf &quot;\n[ACCOUNT ID]\n%s\n--------\n&quot; &quot;$ACCOUNT_ID&quot;
printf &quot;\n[BASE64 API KEY]\n%s\n--------\n&quot; &quot;$BASE64_API_KEY&quot;

DECODED_API_KEY=$(echo &quot;$BASE64_API_KEY&quot; | base64 --decode)
NEWLINES_ESCAPED_API_KEY=$(echo &quot;$DECODED_API_KEY&quot; | awk -v ORS=&apos;\\n&apos; &apos;1&apos;)
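# A note on the awk incantation above: -v ORS=&apos;\\n&apos; sets the output record
# separator to a literal backslash-n, and the trailing &apos;1&apos; is an always-true
# pattern whose default action is to print each input line.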

printf &quot;\n[DECODED_API_KEY]\n%s\n--------\n&quot; &quot;$DECODED_API_KEY&quot;
printf &quot;\n[NEWLINE_ESCAPE_API_KEY]\n%s\n--------\n&quot; &quot;$NEWLINES_ESCAPED_API_KEY&quot;

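# A hedged alternative (assumes jq is installed): jq escapes quotes and
# backslashes too, not just newlines, so the raw decoded key can be passed
# straight in:
#   jq -n --arg accountId &quot;$ACCOUNT_ID&quot; --arg key &quot;$DECODED_API_KEY&quot; \
#     &apos;{accountId: $accountId, key: $key}&apos; &gt; &quot;$JSON_FILE_PATH&quot;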
cat &lt;&lt;EOF &gt; &quot;$JSON_FILE_PATH&quot;
{
  &quot;accountId&quot;: &quot;$ACCOUNT_ID&quot;,
  &quot;key&quot;: &quot;$NEWLINES_ESCAPED_API_KEY&quot;
}
EOF
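# Note: since the EOF delimiter above is unquoted, variables inside the
# here-document are expanded; writing &lt;&lt;&apos;EOF&apos; instead would copy the text
# verbatim, dollar signs included.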

printf &quot;\n[JSON FILE]\n%s\n--------\n&quot; &quot;$(cat &quot;$JSON_FILE_PATH&quot;)&quot;</code></pre><h1 id="side-notes">Side notes</h1><p>At some point, while working on my script, when using <code>echo</code> to print variables assigned using &quot;command substitution&quot; (i.e. <code>DECODED_API_KEY=$(echo $BASE64_API_KEY | base64 --decode)</code>), the line breaks were simply gone, replaced by spaces.</p><p>That&apos;s because, to quote <a href="https://unix.stackexchange.com/a/164548?ref=weekly.elfitz.com">cuonglm&apos;s answer on Stack Exchange</a>:</p><blockquote>The newlines were lost, because the shell had performed <a href="http://pubs.opengroup.org/onlinepubs/9699919799/utilities/V3_chap02.html?ref=weekly.elfitz.com#tag_18_06_05" rel="noreferrer">field splitting</a> after command substitution.</blockquote><p>We can solve that by adding quotes around our variable&apos;s name when printing it, like so:</p><pre><code class="language-bash">echo &quot;$DECODED_API_KEY&quot;</code></pre><h1 id="epilogue">Epilogue</h1><p>As always, should you encounter any issues or have any suggestions for improvements, <a href="https://twitter.com/ElFitz_?ref=weekly.elfitz.com">reach out to me on Twitter!</a> And have a great day! 
Here&apos;s some nice picture of Edinburgh to brighten your day, fresh off Unsplash:</p><figure class="kg-card kg-image-card kg-width-full kg-card-hascaption"><img src="https://images.unsplash.com/photo-1557335525-380f0bc8be34?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=MnwxMTc3M3wwfDF8c2VhcmNofDEyfHxlZGluYnVyZ2h8ZW58MHx8fHwxNjU1NDAzNTcy&amp;ixlib=rb-1.2.1&amp;q=80&amp;w=2000" class="kg-image" alt="From a base64 API Key to a json file, in bash, on macOS" loading="lazy" width="4978" height="2601" srcset="https://images.unsplash.com/photo-1557335525-380f0bc8be34?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=MnwxMTc3M3wwfDF8c2VhcmNofDEyfHxlZGluYnVyZ2h8ZW58MHx8fHwxNjU1NDAzNTcy&amp;ixlib=rb-1.2.1&amp;q=80&amp;w=600 600w, https://images.unsplash.com/photo-1557335525-380f0bc8be34?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=MnwxMTc3M3wwfDF8c2VhcmNofDEyfHxlZGluYnVyZ2h8ZW58MHx8fHwxNjU1NDAzNTcy&amp;ixlib=rb-1.2.1&amp;q=80&amp;w=1000 1000w, https://images.unsplash.com/photo-1557335525-380f0bc8be34?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=MnwxMTc3M3wwfDF8c2VhcmNofDEyfHxlZGluYnVyZ2h8ZW58MHx8fHwxNjU1NDAzNTcy&amp;ixlib=rb-1.2.1&amp;q=80&amp;w=1600 1600w, https://images.unsplash.com/photo-1557335525-380f0bc8be34?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=MnwxMTc3M3wwfDF8c2VhcmNofDEyfHxlZGluYnVyZ2h8ZW58MHx8fHwxNjU1NDAzNTcy&amp;ixlib=rb-1.2.1&amp;q=80&amp;w=2400 2400w"><figcaption>Photo by <a href="https://unsplash.com/@yvesalarie?utm_source=ghost&amp;utm_medium=referral&amp;utm_campaign=api-credit">Yves Alarie</a> / <a href="https://unsplash.com/?utm_source=ghost&amp;utm_medium=referral&amp;utm_campaign=api-credit">Unsplash</a></figcaption></figure>]]></content:encoded></item><item><title><![CDATA[We aren’t that remarkable]]></title><description><![CDATA[I've heard quite a few times that we, as a species, are terrible, destroying the Earth, ourselves, and that we are, all in all, the worse.

That's assuming a lot. First and foremost, that we are as smart and great as we think ourselves to be.]]></description><link>https://weekly.elfitz.com/2022/06/21/we-arent-that-remarkable/</link><guid isPermaLink="false">613796ac44179e003b05adb4</guid><category><![CDATA[Thoughts]]></category><dc:creator><![CDATA[ElFitz]]></dc:creator><pubDate>Tue, 21 Jun 2022 15:00:13 GMT</pubDate><media:content url="https://images.unsplash.com/photo-1535968881874-0c39f1503640?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=MnwxMTc3M3wwfDF8c2VhcmNofDEyfHxKdW5nbGV8ZW58MHx8fHwxNjMxMDMzODMw&amp;ixlib=rb-1.2.1&amp;q=80&amp;w=2000" medium="image"/><content:encoded><![CDATA[<img src="https://images.unsplash.com/photo-1535968881874-0c39f1503640?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=MnwxMTc3M3wwfDF8c2VhcmNofDEyfHxKdW5nbGV8ZW58MHx8fHwxNjMxMDMzODMw&amp;ixlib=rb-1.2.1&amp;q=80&amp;w=2000" alt="We aren&#x2019;t that remarkable"><p>Something irks me, regarding the discourse surrounding global warming. Some people say humanity is a virus. Others, that we are &quot;the only beings capable of causing a mass extinction event all on our own.&quot;</p><p>I&apos;m sorry, but that is plain wrong.</p><p>Not in that we aren&apos;t causing such a mass extinction event. We are.<br>Not because there is no climate change and global warming. There are.</p><p>But because we are just yet another lifeform wreaking havoc on its ecosystem, eradicating a huge number of species of all kinds in the process.</p><h1 id="the-great-oxygenation-event">The Great Oxygenation Event</h1><p>It is suspected that the first Cyanobacteria may have killed off most life when they first appeared.</p><p>How? They basically started producing oxygen through photosynthesis and releasing large amounts of it into the atmosphere. Oxygen, being highly reactive, started oxidizing every living thing it came in contact with, killing off most species at the time. 
It is also suspected of having oxidized large amounts of atmospheric methane into carbon dioxide, weakening the atmosphere&apos;s greenhouse effect and starting a series of ice ages: the <a href="https://en.m.wikipedia.org/wiki/Huronian_glaciation?ref=weekly.elfitz.com">Huronian Glaciation</a>. So there you have it: a bunch of bacteria started producing oxygen because it was convenient to them and thus killed nearly all life on Earth in the process.</p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://images.unsplash.com/photo-1596051827487-7b3d6f6df842?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=MnwxMTc3M3wwfDF8c2VhcmNofDR8fEJhY3RlcmlhfGVufDB8fHx8MTY0NDE5MTAxOQ&amp;ixlib=rb-1.2.1&amp;q=80&amp;w=2000" class="kg-image" alt="We aren&#x2019;t that remarkable" loading="lazy" width="6000" height="4000" srcset="https://images.unsplash.com/photo-1596051827487-7b3d6f6df842?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=MnwxMTc3M3wwfDF8c2VhcmNofDR8fEJhY3RlcmlhfGVufDB8fHx8MTY0NDE5MTAxOQ&amp;ixlib=rb-1.2.1&amp;q=80&amp;w=600 600w, https://images.unsplash.com/photo-1596051827487-7b3d6f6df842?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=MnwxMTc3M3wwfDF8c2VhcmNofDR8fEJhY3RlcmlhfGVufDB8fHx8MTY0NDE5MTAxOQ&amp;ixlib=rb-1.2.1&amp;q=80&amp;w=1000 1000w, https://images.unsplash.com/photo-1596051827487-7b3d6f6df842?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=MnwxMTc3M3wwfDF8c2VhcmNofDR8fEJhY3RlcmlhfGVufDB8fHx8MTY0NDE5MTAxOQ&amp;ixlib=rb-1.2.1&amp;q=80&amp;w=1600 1600w, https://images.unsplash.com/photo-1596051827487-7b3d6f6df842?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=MnwxMTc3M3wwfDF8c2VhcmNofDR8fEJhY3RlcmlhfGVufDB8fHx8MTY0NDE5MTAxOQ&amp;ixlib=rb-1.2.1&amp;q=80&amp;w=2400 2400w" sizes="(min-width: 720px) 720px"><figcaption>Photo by <a href="https://unsplash.com/@vollkornapfel?utm_source=ghost&amp;utm_medium=referral&amp;utm_campaign=api-credit">Adrian Lange</a> / <a 
href="https://unsplash.com/?utm_source=ghost&amp;utm_medium=referral&amp;utm_campaign=api-credit">Unsplash</a></figcaption></figure><h1 id="the-devonian-extinction">The Devonian Extinction</h1><p>And there is the very distinct chance plants may have too, at least once, perhaps twice or more, each time in a different way. In the case of the Devonian Extinction, it would be the &quot;<a href="http://www.devoniantimes.org/opportunity/massExtinction.html?ref=weekly.elfitz.com">Devonian Plant Hypothesis</a>&quot;, where plants would have apparently done the exact opposite of what we&apos;re doing: burying massive amounts of CO2, drastically cooling the Earth.</p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://images.unsplash.com/photo-1542273917363-3b1817f69a2d?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=MnwxMTc3M3wwfDF8c2VhcmNofDV8fHRyZWV8ZW58MHx8fHwxNjU0MDAzODg0&amp;ixlib=rb-1.2.1&amp;q=80&amp;w=2000" class="kg-image" alt="We aren&#x2019;t that remarkable" loading="lazy" width="4288" height="2848" srcset="https://images.unsplash.com/photo-1542273917363-3b1817f69a2d?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=MnwxMTc3M3wwfDF8c2VhcmNofDV8fHRyZWV8ZW58MHx8fHwxNjU0MDAzODg0&amp;ixlib=rb-1.2.1&amp;q=80&amp;w=600 600w, https://images.unsplash.com/photo-1542273917363-3b1817f69a2d?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=MnwxMTc3M3wwfDF8c2VhcmNofDV8fHRyZWV8ZW58MHx8fHwxNjU0MDAzODg0&amp;ixlib=rb-1.2.1&amp;q=80&amp;w=1000 1000w, https://images.unsplash.com/photo-1542273917363-3b1817f69a2d?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=MnwxMTc3M3wwfDF8c2VhcmNofDV8fHRyZWV8ZW58MHx8fHwxNjU0MDAzODg0&amp;ixlib=rb-1.2.1&amp;q=80&amp;w=1600 1600w, https://images.unsplash.com/photo-1542273917363-3b1817f69a2d?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=MnwxMTc3M3wwfDF8c2VhcmNofDV8fHRyZWV8ZW58MHx8fHwxNjU0MDAzODg0&amp;ixlib=rb-1.2.1&amp;q=80&amp;w=2400 2400w" sizes="(min-width: 720px) 
720px"><figcaption>Trees, am I right? - Photo by <a href="https://unsplash.com/es/@maritafox?utm_source=ghost&amp;utm_medium=referral&amp;utm_campaign=api-credit">Marita Kavelashvili</a> / <a href="https://unsplash.com/?utm_source=ghost&amp;utm_medium=referral&amp;utm_campaign=api-credit">Unsplash</a></figcaption></figure><h1 id="the-permian-triassic-mass-extinction">The Permian-Triassic Mass Extinction</h1><p>This time around, 90% of all life on Earth is thought to have been wiped off the face of the planet, including over 95% of all sea species at the time.</p><p>In this case, it might have been caused by an asteroid, volcanic activity, climate change (again), or... microbes. Some methane-producing bacteria&apos;s population may have simply grown explosively across the world (sound familiar?), releasing massive amounts of methane and thus drastically changing the climate along with the oceans&apos; chemistry. And killing nearly all life on Earth at the time.</p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://images.unsplash.com/photo-1535127022272-dbe7ee35cf33?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=MnwxMTc3M3wwfDF8c2VhcmNofDEzfHxiYWN0ZXJpYXxlbnwwfHx8fDE2NTQwMDQyODA&amp;ixlib=rb-1.2.1&amp;q=80&amp;w=2000" class="kg-image" alt="We aren&#x2019;t that remarkable" loading="lazy" width="6000" height="4000" srcset="https://images.unsplash.com/photo-1535127022272-dbe7ee35cf33?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=MnwxMTc3M3wwfDF8c2VhcmNofDEzfHxiYWN0ZXJpYXxlbnwwfHx8fDE2NTQwMDQyODA&amp;ixlib=rb-1.2.1&amp;q=80&amp;w=600 600w, https://images.unsplash.com/photo-1535127022272-dbe7ee35cf33?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=MnwxMTc3M3wwfDF8c2VhcmNofDEzfHxiYWN0ZXJpYXxlbnwwfHx8fDE2NTQwMDQyODA&amp;ixlib=rb-1.2.1&amp;q=80&amp;w=1000 1000w, 
https://images.unsplash.com/photo-1535127022272-dbe7ee35cf33?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=MnwxMTc3M3wwfDF8c2VhcmNofDEzfHxiYWN0ZXJpYXxlbnwwfHx8fDE2NTQwMDQyODA&amp;ixlib=rb-1.2.1&amp;q=80&amp;w=1600 1600w, https://images.unsplash.com/photo-1535127022272-dbe7ee35cf33?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=MnwxMTc3M3wwfDF8c2VhcmNofDEzfHxiYWN0ZXJpYXxlbnwwfHx8fDE2NTQwMDQyODA&amp;ixlib=rb-1.2.1&amp;q=80&amp;w=2400 2400w" sizes="(min-width: 720px) 720px"><figcaption>So tiny, and yet so deadly - Photo by <a href="https://unsplash.com/es/@michael_schiffer_design?utm_source=ghost&amp;utm_medium=referral&amp;utm_campaign=api-credit">Michael Schiffer</a> / <a href="https://unsplash.com/?utm_source=ghost&amp;utm_medium=referral&amp;utm_campaign=api-credit">Unsplash</a></figcaption></figure><h1 id="conclusion">Conclusion</h1><p>We&#x2019;re basically just the new kids on the block. Arrogant, full of ourselves, and convinced that <em>we</em> are special. <em>We</em> are different. Even when we make mistakes that have been made over and over again long before any human walked this Earth <em>by living organisms that don&apos;t even have a brain</em>. </p><p>Especially then.</p><p>Because, being so high and mighty, full of confidence in our manifest superiority, <em>we</em> should know better.</p><p>Well, we don&apos;t. Humanity is just a know-it-all, overly proud, pompous kid, with milk still dripping out of its nose.</p><p>Which doesn&apos;t make what we&apos;re doing right. 
It just makes us, as a species, unremarkably average.</p><hr><h1 id="resources">Resources:</h1><ul><li><a href="https://blogs.agu.org/thefield/2018/03/02/plants-cause-one-earths-great-extinctions/?ref=weekly.elfitz.com">https://blogs.agu.org/thefield/2018/03/02/plants-cause-one-earths-great-extinctions/</a></li><li><a href="https://www.pbs.org/video/how-plants-caused-the-first-mass-extinction-ngqncd/?ref=weekly.elfitz.com">https://www.pbs.org/video/how-plants-caused-the-first-mass-extinction-ngqncd/</a></li><li><a href="https://www.thoughtco.com/the-5-major-mass-extinctions-4018102?ref=weekly.elfitz.com">https://www.thoughtco.com/the-5-major-mass-extinctions-4018102</a></li><li><a href="https://www.worldatlas.com/articles/the-timeline-of-the-mass-extinction-events-on-earth.html?ref=weekly.elfitz.com">https://www.worldatlas.com/articles/the-timeline-of-the-mass-extinction-events-on-earth.html</a></li><li><a href="https://slate.com/technology/2014/07/the-great-oxygenation-event-the-earths-first-mass-extinction.html?ref=weekly.elfitz.com">https://slate.com/technology/2014/07/the-great-oxygenation-event-the-earths-first-mass-extinction.html</a></li><li><a href="https://ed.ted.com/lessons/how-a-single-celled-organism-almost-wiped-out-life-on-earth-anusuya-willis?ref=weekly.elfitz.com">https://ed.ted.com/lessons/how-a-single-celled-organism-almost-wiped-out-life-on-earth-anusuya-willis</a></li><li><a href="http://www.devoniantimes.org/opportunity/massExtinction.html?ref=weekly.elfitz.com">http://www.devoniantimes.org/opportunity/massExtinction.html</a></li><li><a href="https://astrobiology.nasa.gov/news/microbial-innovation-causes-the-end-permian-extinction/?ref=weekly.elfitz.com">https://astrobiology.nasa.gov/news/microbial-innovation-causes-the-end-permian-extinction/</a></li><li><a 
href="https://www.nationalgeographic.com/science/article/permian-extinction?ref=weekly.elfitz.com">https://www.nationalgeographic.com/science/article/permian-extinction</a></li></ul>]]></content:encoded></item><item><title><![CDATA[Escape a newline in bash on macOS]]></title><description><![CDATA[While setting up a new step for a client's CI pipeline on a macOS agent, I needed to export an API key containing line breaks to a JSON file. Meaning I had to escape line breaks, in bash, on macOS. Turns out, it's not as easy as it sounds. Who would've guessed?]]></description><link>https://weekly.elfitz.com/2022/06/16/escape-a-newline-in-bash-on-macos/</link><guid isPermaLink="false">62ab5fa69e55f1003ddf594a</guid><category><![CDATA[shell]]></category><category><![CDATA[Programming]]></category><dc:creator><![CDATA[ElFitz]]></dc:creator><pubDate>Thu, 16 Jun 2022 18:04:23 GMT</pubDate><media:content url="https://images.unsplash.com/photo-1535551951406-a19828b0a76b?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=MnwxMTc3M3wwfDF8c2VhcmNofDc3fHxjb2RlfGVufDB8fHx8MTY1NTQwMDk3Mw&amp;ixlib=rb-1.2.1&amp;q=80&amp;w=2000" medium="image"/><content:encoded><![CDATA[<img src="https://images.unsplash.com/photo-1535551951406-a19828b0a76b?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=MnwxMTc3M3wwfDF8c2VhcmNofDc3fHxjb2RlfGVufDB8fHx8MTY1NTQwMDk3Mw&amp;ixlib=rb-1.2.1&amp;q=80&amp;w=2000" alt="Escape a newline in bash on macOS"><p>It seems obvious, right? Simply do <code>echo $OUR_VAR | sed &apos;s/\n/\\n/g&apos;</code>, &#xA0;right?</p><p>For those of you already convinced it is not that easy, just skip below. For the potential future colleague I need to convince I&apos;m not a hack, let&apos;s go ahead and try that with a simple string from our favorite text generator: <a href="https://slipsum.com/?ref=weekly.elfitz.com">Samuel L Ipsum</a>.</p><pre><code class="language-bash">OUR_VAR=&quot;My money&apos;s in that office, right? 
If she start giving me some bullshit about it ain&apos;t there, and we got to go someplace else and get it, I&apos;m gonna shoot you in the head then and there. Then I&apos;m gonna shoot that bitch in the kneecaps, find out where my goddamn money is. She gonna tell me too. Hey, look at me when I&apos;m talking to you, motherfucker. You listen: we go in there, and that nigga Winston or anybody else is in there, you the first motherfucker to get shot. You understand?

Normally, both your asses would be dead as fucking fried chicken, but you happen to pull this shit while I&apos;m in a transitional period so I don&apos;t wanna kill you, I wanna help you. But I can&apos;t give you this case, it don&apos;t belong to me. Besides, I&apos;ve already been through too much shit this morning over this case to hand it over to your dumb ass.&quot;

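# Printing the variable back to check it. Quote it when echoing, or word
# splitting will collapse the line breaks into single spaces: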
echo &quot;$OUR_VAR&quot;</code></pre><p>Everything good? Now, let&apos;s escape our newlines.</p><pre><code class="language-bash">echo &quot;$OUR_VAR&quot; | sed &apos;s/\n/\\n/g&apos;</code></pre><blockquote>My money&apos;s in that office, right? If she start giving me some bullshit about it ain&apos;t there, and we got to go someplace else and get it, I&apos;m gonna shoot you in the head then and there. Then I&apos;m gonna shoot that bitch in the kneecaps, find out where my goddamn money is. She gonna tell me too. Hey, look at me when I&apos;m talking to you, motherfucker. You listen: we go in there, and that nigga Winston or anybody else is in there, you the first motherfucker to get shot. You understand?<br> <br>Normally, both your asses would be dead as fucking fried chicken, but you happen to pull this shit while I&apos;m in a transitional period so I don&apos;t wanna kill you, I wanna help you. But I can&apos;t give you this case, it don&apos;t belong to me. Besides, I&apos;ve already been through too much shit this morning over this case to hand it over to your dumb ass.</blockquote><p>Are you getting the same thing? We should be seeing two <code>\n</code> in there between our paragraphs, right?</p><p>I searched around, and mostly found people trying to accomplish the opposite: replacing something with a newline, and finally discovered that, to quote <a href="https://stackoverflow.com/a/1252010/7243001?ref=weekly.elfitz.com">another StackOverflow answer</a>:</p><blockquote>sed is intended to be used on line-based input. Although it can do what you need.</blockquote><p>Despite trying to adapt the suggested solutions, like using <code>tr</code>, all I got were extra backslashes in my string.</p><pre><code class="language-bash">echo &quot;$OUR_VAR&quot; | tr &apos;\n&apos; &apos;\\n&apos;</code></pre><blockquote>My money&apos;s in that office, right? 
If she start giving me some bullshit about it ain&apos;t there, and we got to go someplace else and get it, I&apos;m gonna shoot you in the head then and there. Then I&apos;m gonna shoot that bitch in the kneecaps, find out where my goddamn money is. She gonna tell me too. Hey, look at me when I&apos;m talking to you, motherfucker. You listen: we go in there, and that nigga Winston or anybody else is in there, you the first motherfucker to get shot. You understand?\Normally, both your asses would be dead as fucking fried chicken, but you happen to pull this shit while I&apos;m in a transitional period so I don&apos;t wanna kill you, I wanna help you. But I can&apos;t give you this case, it don&apos;t belong to me. Besides, I&apos;ve already been through too much shit this morning over this case to hand it over to your dumb ass.\</blockquote><p>But I&apos;ll spare you everything else I tried, and skip to the solution. Because we both have better things to do. Like grab a &quot;Proper Hot Chocolate&quot; from one of the best places in Edinburgh: <a href="https://goo.gl/maps/U8sd39BuEpTRVg6Z8?ref=weekly.elfitz.com">Uplands Roast</a> (their mocha are great too).</p><h1 id="how-its-actually-done">How it&apos;s actually done</h1><p>After rephrasing my search queries, and trying something both more and less specific, <a href="https://stackoverflow.com/a/24942262/7243001?ref=weekly.elfitz.com">the answer</a> was finally to be found in Google&apos;s second result&apos;s second subresult.</p><figure class="kg-card kg-image-card"><img src="https://weekly.elfitz.com/content/images/2022/06/Screenshot-2022-06-16-at-18.28.55.png" class="kg-image" alt="Escape a newline in bash on macOS" loading="lazy" width="870" height="699" srcset="https://weekly.elfitz.com/content/images/size/w600/2022/06/Screenshot-2022-06-16-at-18.28.55.png 600w, https://weekly.elfitz.com/content/images/2022/06/Screenshot-2022-06-16-at-18.28.55.png 870w" sizes="(min-width: 720px) 720px"></figure><p>The 
answer? Change the &apos;output record separator&apos; (ORS in the script) to <code>\\n</code> instead of <code>\n</code> using <a href="https://en.wikipedia.org/wiki/AWK?ref=weekly.elfitz.com">awk</a>:</p><pre><code class="language-bash">echo &quot;$OUR_VAR&quot; | awk -v ORS=&apos;\\n&apos; &apos;1&apos;</code></pre><blockquote>My money&apos;s in that office, right? If she start giving me some bullshit about it ain&apos;t there, and we got to go someplace else and get it, I&apos;m gonna shoot you in the head then and there. Then I&apos;m gonna shoot that bitch in the kneecaps, find out where my goddamn money is. She gonna tell me too. Hey, look at me when I&apos;m talking to you, motherfucker. You listen: we go in there, and that nigga Winston or anybody else is in there, you the first motherfucker to get shot. You understand?\n\nNormally, both your asses would be dead as fucking fried chicken, but you happen to pull this shit while I&apos;m in a transitional period so I don&apos;t wanna kill you, I wanna help you. But I can&apos;t give you this case, it don&apos;t belong to me. Besides, I&apos;ve already been through too much shit this morning over this case to hand it over to your dumb ass.\n</blockquote><p>And there they are! 
Our gloriously escaped newline characters!</p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://images.unsplash.com/photo-1606068498020-f2e881dd7197?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=MnwxMTc3M3wwfDF8c2VhcmNofDEwNHx8cmFuZG9tfGVufDB8fHx8MTY1NTQwMTk4Mw&amp;ixlib=rb-1.2.1&amp;q=80&amp;w=2000" class="kg-image" alt="Escape a newline in bash on macOS" loading="lazy" width="6000" height="4000" srcset="https://images.unsplash.com/photo-1606068498020-f2e881dd7197?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=MnwxMTc3M3wwfDF8c2VhcmNofDEwNHx8cmFuZG9tfGVufDB8fHx8MTY1NTQwMTk4Mw&amp;ixlib=rb-1.2.1&amp;q=80&amp;w=600 600w, https://images.unsplash.com/photo-1606068498020-f2e881dd7197?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=MnwxMTc3M3wwfDF8c2VhcmNofDEwNHx8cmFuZG9tfGVufDB8fHx8MTY1NTQwMTk4Mw&amp;ixlib=rb-1.2.1&amp;q=80&amp;w=1000 1000w, https://images.unsplash.com/photo-1606068498020-f2e881dd7197?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=MnwxMTc3M3wwfDF8c2VhcmNofDEwNHx8cmFuZG9tfGVufDB8fHx8MTY1NTQwMTk4Mw&amp;ixlib=rb-1.2.1&amp;q=80&amp;w=1600 1600w, https://images.unsplash.com/photo-1606068498020-f2e881dd7197?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=MnwxMTc3M3wwfDF8c2VhcmNofDEwNHx8cmFuZG9tfGVufDB8fHx8MTY1NTQwMTk4Mw&amp;ixlib=rb-1.2.1&amp;q=80&amp;w=2400 2400w" sizes="(min-width: 720px) 720px"><figcaption>Ah, those newlines! Almost as beautiful as some random Unsplash picture! - Photo by <a href="https://unsplash.com/@thezenoeffect?utm_source=ghost&amp;utm_medium=referral&amp;utm_campaign=api-credit">Shyam</a> / <a href="https://unsplash.com/?utm_source=ghost&amp;utm_medium=referral&amp;utm_campaign=api-credit">Unsplash</a></figcaption></figure><h1 id="epilogue">Epilogue</h1><p>That&apos;s it! Enjoy the rest of your day, and the few hours (if you had the same issue and this somehow popped up in the first results) you&apos;ve just saved! 
&#x1F609;</p>]]></content:encoded></item><item><title><![CDATA[Build OpenCV Contrib as a .xcframework, for iOS, on a M1]]></title><description><![CDATA[We needed an OpenCV Contrib module in an iOS app, using Swift Package Manager. I thought the hardest part would be using OpenCV. How wrong I was.]]></description><link>https://weekly.elfitz.com/2022/06/14/build-opencv-contrib-for-ios/</link><guid isPermaLink="false">62991d6b21ab0d004dd0fa4c</guid><category><![CDATA[Swift Programming]]></category><category><![CDATA[iOS Development]]></category><category><![CDATA[Programming]]></category><category><![CDATA[macOS]]></category><dc:creator><![CDATA[ElFitz]]></dc:creator><pubDate>Tue, 14 Jun 2022 15:00:39 GMT</pubDate><media:content url="https://images.unsplash.com/photo-1620712943543-bcc4688e7485?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=MnwxMTc3M3wwfDF8c2VhcmNofDF8fGFpfGVufDB8fHx8MTY1NDIwMzM5OQ&amp;ixlib=rb-1.2.1&amp;q=80&amp;w=2000" medium="image"/><content:encoded><![CDATA[<h1 id="context">Context</h1><img src="https://images.unsplash.com/photo-1620712943543-bcc4688e7485?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=MnwxMTc3M3wwfDF8c2VhcmNofDF8fGFpfGVufDB8fHx8MTY1NDIwMzM5OQ&amp;ixlib=rb-1.2.1&amp;q=80&amp;w=2000" alt="Build OpenCV Contrib as a .xcframework, for iOS, on a M1"><p>We needed an OpenCV Contrib module in an iOS app. When reviewing options, it was decided we would expose a Swift package, wrapping an Objective-C wrapper, that would itself be the one importing OpenCV.</p><p>This meant exposing OpenCV as a <a href="https://www.google.com/search?client=safari&amp;rls=en&amp;q=swift+package+manager&amp;ie=UTF-8&amp;oe=UTF-8&amp;ref=weekly.elfitz.com">Swift Package Manager</a> <a href="https://developer.apple.com/documentation/swift_packages/distributing_binary_frameworks_as_swift_packages?ref=weekly.elfitz.com">binary target</a>. 
Which requires us to build a <a href="https://blog.embrace.io/xcode-12-and-xcframework/?ref=weekly.elfitz.com">.xcframework</a>.</p><h1 id="goal">Goal</h1><p>Build a <code>.xcframework</code> that we can run on both iOS devices and Intel &amp; Apple M1 Mac iPhone / iPad Simulators. Why a <code>.xcframework</code> and not simply a <code>.framework</code>? Because while Swift Package Manager can import binaries, it can only import <code>.xcframework</code> ones, not <code>.framework</code> ones.</p><h1 id="requirements">Requirements</h1><ul><li><a href="https://cmake.org/?ref=weekly.elfitz.com">cmake</a> (<a href="https://formulae.brew.sh/formula/cmake?ref=weekly.elfitz.com">installable using brew</a>)</li><li>python3 (because python2.7 is obsolete) (installable using brew)</li><li>opencv source (via <code>git clone https://github.com/opencv/opencv.git</code>)</li><li>opencv_contrib source (via <code>git clone https://github.com/opencv/opencv_contrib.git</code>)</li></ul><h1 id="building">Building</h1><p>Build OpenCV using the proper flags (outfile, contrib, iphoneos arch, iphonesimulator archs):</p><pre><code class="language-bash">python3 opencv/platforms/apple/build_xcframework.py \
--out ./opencv-build \
--contrib opencv_contrib \
--iphoneos_archs armv7,armv7s,arm64 \
--iphonesimulator_archs arm64 \
--build_only_specified_archs</code></pre><p>Where the <code>--contrib</code> parameter&apos;s value is the local path to the opencv_contrib source.</p><div class="kg-card kg-callout-card kg-callout-card-grey"><div class="kg-callout-emoji">&#x1F4A1;</div><div class="kg-callout-text">I tried building it for both `x86_64` and `arm64` simulators, but it simply wouldn&apos;t build, probably because my M1 Mac has an `arm64` CPU architecture. An Intel Mac (or an M1 one running Rosetta) may not be able to build for the `arm64` simulator. I will look into that in the future.</div></div><h1 id="go-make-yourself-a-nice-and-warm-cup-of-tea-%F0%9F%AB%96-because-i-currently-am-writing-this-in-the-uk-%F0%9F%87%AC%F0%9F%87%A7">Go make yourself a nice and warm cup of tea &#x1FAD6; (because I currently am writing this in the UK &#x1F1EC;&#x1F1E7; )</h1><p>With a dash of whisky in it &#x1F943;, because I&apos;m more specifically in Scotland &#x1F3F4;&#xE0067;&#xE0062;&#xE0073;&#xE0063;&#xE0074;&#xE007F; at the moment</p><h1 id="wait">Wait</h1><p>... Wait</p><p>... Wait some more</p><p><strong>Your tea should be ready now, more or less</strong></p><p><em>Wait</em></p><p>... ... <em>Wait</em></p><h5 id="wait-1">Wait</h5><p><strong><em>Wait some more</em></strong></p><p>Wait</p><h1 id="you-can-now-carefully-sip-your-tea">You can now carefully sip your tea</h1><pre><code class="language-bash">cd /Users/path/opencv-build/iphonesimulator/build/build-arm64-iphonesimulator/modules/objc_bindings_generator/ios/gen
    export LANG\=en_US.US-ASCII
    /Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin/clang -x objective-c++ -target arm64-apple-ios9.0-simulator -fmessage-length\=122 -fdiagnostics-show-note-include-stack -fmacro-backtrace-limit\=0 -fcolor-diagnostics -Wno-trigraphs -fpascal-strings -O1 -Wno-missing-field-initializers -Wno-missing-prototypes -Wno-return-type -Wno-implicit-atomic-properties -Wno-objc-interface-ivars -Wno-arc-repeated-use-of-weak -Wno-non-virtual-dtor -Wno-overloaded-virtual -Wno-exit-time-destructors -Wno-missing-braces -Wparentheses -Wswitch -Wno-unused-function -Wno-unused-label -Wno-unused-parameter -Wno-unused-variable -Wunused-value -Wno-emp</code></pre><p>Sip some tea.</p><p>Wait.</p><p>Did you know <a href="https://www.stashtea.com/blogs/education/tea-types?ref=weekly.elfitz.com">there are over </a><strong><a href="https://www.stashtea.com/blogs/education/tea-types?ref=weekly.elfitz.com">3000</a></strong><a href="https://www.stashtea.com/blogs/education/tea-types?ref=weekly.elfitz.com"> varieties of tea out there</a>? I certainly didn&apos;t. 
Until I looked it up to give you some random, vaguely appropriate, fun fact.</p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://images.unsplash.com/photo-1437315306147-0923bdb3fc12?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=MnwxMTc3M3wwfDF8c2VhcmNofDE0fHx0ZWElMjBsZWF2ZXN8ZW58MHx8fHwxNjU0MjAyODg4&amp;ixlib=rb-1.2.1&amp;q=80&amp;w=2000" class="kg-image" alt="Build OpenCV Contrib as a .xcframework, for iOS, on a M1" loading="lazy" width="5376" height="3025" srcset="https://images.unsplash.com/photo-1437315306147-0923bdb3fc12?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=MnwxMTc3M3wwfDF8c2VhcmNofDE0fHx0ZWElMjBsZWF2ZXN8ZW58MHx8fHwxNjU0MjAyODg4&amp;ixlib=rb-1.2.1&amp;q=80&amp;w=600 600w, https://images.unsplash.com/photo-1437315306147-0923bdb3fc12?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=MnwxMTc3M3wwfDF8c2VhcmNofDE0fHx0ZWElMjBsZWF2ZXN8ZW58MHx8fHwxNjU0MjAyODg4&amp;ixlib=rb-1.2.1&amp;q=80&amp;w=1000 1000w, https://images.unsplash.com/photo-1437315306147-0923bdb3fc12?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=MnwxMTc3M3wwfDF8c2VhcmNofDE0fHx0ZWElMjBsZWF2ZXN8ZW58MHx8fHwxNjU0MjAyODg4&amp;ixlib=rb-1.2.1&amp;q=80&amp;w=1600 1600w, https://images.unsplash.com/photo-1437315306147-0923bdb3fc12?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=MnwxMTc3M3wwfDF8c2VhcmNofDE0fHx0ZWElMjBsZWF2ZXN8ZW58MHx8fHwxNjU0MjAyODg4&amp;ixlib=rb-1.2.1&amp;q=80&amp;w=2400 2400w" sizes="(min-width: 720px) 720px"><figcaption>Photo by <a href="https://unsplash.com/@tsaiga?utm_source=ghost&amp;utm_medium=referral&amp;utm_campaign=api-credit">&#x8521; &#x5609;&#x5B87;</a> / <a href="https://unsplash.com/?utm_source=ghost&amp;utm_medium=referral&amp;utm_campaign=api-credit">Unsplash</a></figcaption></figure><p> </p><h1 id="go-buy-yourself-some-nice-biscuits-to-go-with-your-tea">Go buy yourself some nice biscuits to go with your tea</h1><pre><code class="language-bash">-o 
/Users/path/opencv-build/iphonesimulator/build/build-arm64-iphonesimulator/modules/objc/framework_build/opencv2.build/Release-iphonesimulator/opencv2.build/Objects-normal/arm64/AffineFeature.o

CompileC /Users/path/opencv-build/iphonesimulator/build/build-arm64-iphonesimulator/modules/objc/framework_build/opencv2.build/Release-iphonesimulator/opencv2.build/Objects-normal/arm64/AdaptiveManifoldFilter.o /Users/path/opencv-build/build-arm64-iphonesimulator/modules/objc_bindings_generator/ios/gen/objc/ximgproc/AdaptiveManifoldFilter.mm normal arm64 objective-c++ com.apple.compilers.llvm.clang.1_0.compiler (in target &apos;opencv2&apos; from project &apos;opencv2&apos;)
    cd /Users/path/opencv-build/iphonesimulator/build/build-arm64-iphonesimulator/modules/objc_bindings_generator/ios/gen
    export LANG\=en_US.US-ASCII
    /Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin/clang -x objective-c++ -target arm64-apple-ios9.0-simulator</code></pre><h1 id="sip-your-tea-some-more-now-with-biscuits">Sip your tea some more, now with biscuits</h1><h5 id="wait-2">Wait</h5><p></p><h5 id="wait-3">... Wait</h5><p></p><h1 id="put-the-cup-of-tea-down">Put the cup of tea down...</h1><pre><code class="language-bash">Executing: [&apos;cmake&apos;, &apos;-DBUILD_TYPE=Release&apos;, &apos;-DCMAKE_INSTALL_PREFIX=/Users/username/path/opencv-build/iphonesimulator/build/build-arm64-iphonesimulator/install&apos;, &apos;-P&apos;, &apos;cmake_install.cmake&apos;] in /Users/path/opencv-build/iphonesimulator/build/build-arm64-iphonesimulator/modules/objc/framework_build
Executing: cmake -DBUILD_TYPE=Release -DCMAKE_INSTALL_PREFIX=/Users/path/opencv-build/iphonesimulator/build/build-arm64-iphonesimulator/install -P cmake_install.cmake
-- Install configuration: &quot;Release&quot;</code></pre><p>... lay back in your chair...</p><p>... and do a 360!</p><!--kg-card-begin: html--><div style="width:100%;height:0;padding-bottom:56%;position:relative;"><iframe src="https://giphy.com/embed/LdCEoNk88QAJk3estX" width="100%" height="100%" style="position:absolute" frameborder="0" class="giphy-embed" allowfullscreen></iframe></div><p><a href="https://giphy.com/gifs/HannahWitton-hannah-witton-LdCEoNk88QAJk3estX?ref=weekly.elfitz.com"></a></p><!--kg-card-end: html--><!--kg-card-begin: html--><div style="width:100%;height:0;padding-bottom:178%;position:relative;"><iframe src="https://giphy.com/embed/3og0ILE8o6MzZsSWg8" width="100%" height="100%" style="position:absolute" frameborder="0" class="giphy-embed" allowfullscreen></iframe></div><p><a href="https://giphy.com/gifs/nickmurthwaite-spin-idiot-3og0ILE8o6MzZsSWg8?ref=weekly.elfitz.com"></a></p><!--kg-card-end: html--><h1 id="turn-around-some-more">Turn around some more!</h1><!--kg-card-begin: html--><div style="width:100%;height:0;padding-bottom:100%;position:relative;"><iframe src="https://giphy.com/embed/s5FhlJsjiT5aolWEf7" width="100%" height="100%" style="position:absolute" frameborder="0" class="giphy-embed" allowfullscreen></iframe></div><p><a href="https://giphy.com/gifs/nickelodeon-creepy-doll-warped-s5FhlJsjiT5aolWEf7?ref=weekly.elfitz.com">via GIPHY</a></p><!--kg-card-end: html--><p>Oh. That got creepy.</p><p>Oh, it&apos;s done building! Congrats!</p><pre><code class="language-bash">============================================================
Finished building ./opencv-build/opencv2.xcframework
============================================================</code></pre><p>That wasn&apos;t so hard now, was it? Now you can use OpenCV contrib modules in your app! Isn&apos;t that nice! Next, how to import it into our projects as a binary target using Swift Package Manager!</p><h1 id="resources">Resources:</h1><ul><li><a href="https://docs.opencv.org/4.x/d5/da3/tutorial_ios_install.html?ref=weekly.elfitz.com">https://docs.opencv.org/4.x/d5/da3/tutorial_ios_install.html</a></li><li><a href="https://docs.opencv.org/4.x/d7/d88/tutorial_hello.html?ref=weekly.elfitz.com">https://docs.opencv.org/4.x/d7/d88/tutorial_hello.html</a></li><li><a href="https://fossies.org/linux/opencv/platforms/apple/readme.md?ref=weekly.elfitz.com">https://fossies.org/linux/opencv/platforms/apple/readme.md</a></li><li><a href="https://stackoverflow.com/questions/44584343/how-to-build-opencv-in-ios?ref=weekly.elfitz.com">https://stackoverflow.com/questions/44584343/how-to-build-opencv-in-ios</a></li><li><a href="https://developer.apple.com/forums/thread/666335?ref=weekly.elfitz.com">https://developer.apple.com/forums/thread/666335</a></li><li><a href="https://stackoverflow.com/questions/66197098/import-framework-inside-spm-swift-package-manager?ref=weekly.elfitz.com">https://stackoverflow.com/questions/66197098/import-framework-inside-spm-swift-package-manager</a></li><li><a href="https://forums.swift.org/t/how-to-import-prebuilt-frameworks-from-local-project-in-swift-package/35985?ref=weekly.elfitz.com">https://forums.swift.org/t/how-to-import-prebuilt-frameworks-from-local-project-in-swift-package/35985</a></li><li><a href="https://github.com/ezhes/sendbird-ios-xcframework?ref=weekly.elfitz.com">https://github.com/ezhes/sendbird-ios-xcframework</a></li><li><a href="https://github.com/spouliot/xcframework?ref=weekly.elfitz.com">https://github.com/spouliot/xcframework</a></li>
<li><a href="https://vovkos.github.io/doxyrest-showcase/opencv/sphinx_rtd_theme/page_tutorial_ios_install.html?ref=weekly.elfitz.com">https://vovkos.github.io/doxyrest-showcase/opencv/sphinx_rtd_theme/page_tutorial_ios_install.html</a></li></ul>]]></content:encoded></item><item><title><![CDATA[Work is a market]]></title><description><![CDATA[With remote work on the rise, I've seen people debating the fairness of companies adjusting pay based on one's location and cost of living. As if there ever was an intrinsic justification to compensation.

Spoiler: there isn't.]]></description><link>https://weekly.elfitz.com/2022/06/07/work-is-a-market/</link><guid isPermaLink="false">61b5f90c61a8d9003b97b825</guid><category><![CDATA[Thoughts]]></category><dc:creator><![CDATA[ElFitz]]></dc:creator><pubDate>Tue, 07 Jun 2022 15:00:54 GMT</pubDate><media:content url="https://images.unsplash.com/photo-1540566472852-b82269db3063?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=MnwxMTc3M3wwfDF8c2VhcmNofDIzfHxPbGQlMjBtYW58ZW58MHx8fHwxNjM5MzE3NDY2&amp;ixlib=rb-1.2.1&amp;q=80&amp;w=2000" medium="image"/><content:encoded><![CDATA[<h2 id="introduction">Introduction</h2><img src="https://images.unsplash.com/photo-1540566472852-b82269db3063?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=MnwxMTc3M3wwfDF8c2VhcmNofDIzfHxPbGQlMjBtYW58ZW58MHx8fHwxNjM5MzE3NDY2&amp;ixlib=rb-1.2.1&amp;q=80&amp;w=2000" alt="Work is a market"><p>A few years ago, the building I lived in in the Paris area was undergoing some renovation. Every day, men shoveled gravel, broke asphalt, cut concrete panels and laid them down.</p><p>Amongst them, there was an old man. There was a grandfatherly air to him. An old man with the heavy build of those who eat too much. The belly of one that drinks too much beer. Discheveled grey hair, a short unkempt beard, and the big reddish nose typical of those who drank too much alcohol for too many years.</p><p>While much younger ones manoeuvered a small excavator to break down the parking lot&apos;s coating, he filled wheelbarrows of dirt with a shovel. While much younger ones unrolled layers of waterproofing coatings, he carried his heavy wheelbarrow across the whole parking lot.</p><p>Seeing him filled me with a strange sort of sadness. There he was, at the end of his work life, and still he had to put his aging body through this rough, physically damaging, ordeal.</p><p>And all for what? Minimum wage? One and a half times minimum wage? 
Was that what we paid those who rebuild our buildings&apos; parking lots, our streets? Those who actually build and maintain all the infrastructure we all need in our daily lives, at the cost of ruined joints, arthritis, their whole physical health? Why were they paid so little compared to me, who merely contributed to vaguely useful dating apps or grocery delivery platforms?</p><p>Why was I, back when I worked at McDonald&apos;s, when I actually <em>made</em> food for hundreds of people every day, burning my hands and forearms several times a week, paid so little compared to now?</p><h2 id="return-on-investment">Return on Investment</h2><p>The first reason is scalability. Write once, run everywhere. Software&apos;s innate scalability in today&apos;s connected world means that what I am paid to build can be sold, or used to sell, to thousands, hundreds of thousands, or millions of customers. Meaning that <em>if</em> the software has a market fit <em>and</em> management is reasonably smart, the product will probably bring in much more revenue than whatever cost my work could represent.</p><h2 id="supply-and-demand">Supply and demand</h2><p>The second reason is that, today, work is a market. We are paid as little to do something as enough of us are willing to get paid to do it. It is not a matter of the skills involved, the knowledge required, or the intrinsic value of what is accomplished through that work. It is purely a matter of supply and demand.</p><h2 id="work-is-a-market-governed-by-supply-demand-and-return-on-investment">Work is a market, governed by supply, demand, and return on investment</h2><p>Many of us are willing to get paid very little to ruin our backs pushing wheelbarrows full of rocks or burn our hands and arms making hamburgers. Sure, <em>most</em> of us won&apos;t like it.
But <em>enough</em> are willing to nonetheless.</p><p>On the other hand, some <em>&quot;highly skilled&quot;</em> jobs pay much more than one would earn by working as a construction worker or fast food employee. Still, why am <em>I</em>, as a software engineer, paid much more than most teachers in the world? Why are medical interns with over <em>6 years</em> of study, and actually saving lives on a regular basis, only paid as much as Paris&apos; garbage collectors?</p><p>If some skills and knowledge are valued today and lead to better pay, it&apos;s only for two reasons:</p><ul><li>the work that can be produced using these skills and knowledge can be used, one way or another, to bring in more revenue than what the people with these skills and knowledge are paid to do that work</li><li>most, <em>or enough</em>, of the people with those skills and knowledge won&apos;t work for less</li></ul><h2 id="conclusion">Conclusion</h2><p>A degree doesn&apos;t mean we&apos;ll get a better or better-paying job. Skills won&apos;t either.</p><p>The only reasons I am paid more today building and maintaining apps and backends than I was years ago flipping burgers are that there are fewer people who know how to create software than there are who can make a burger, that enough of us aren&apos;t willing to be paid less to do our job, and that somehow what I make and fix is worth more than what I cost.</p><p>But if we don&apos;t get ourselves skills that are in demand, and ways to prove we have them, then all we have to offer is that we are cheaper than the automated alternative, are more easily trained for a job than a monkey, and have a working body we are willing to ruin for a price.</p>]]></content:encoded></item><item><title><![CDATA[Add a height-flexible placeholder to your UITextView]]></title><description><![CDATA[<p>Once you have tasted SwiftUI, this kind of UIKit lacking makes you wish you could refactor a client&apos;s whole codebase at a snap from your fingers. 
At least the UI side of it. But, alas, there is no magic spell for that. Yet.</p><p>What kind of lacking? <code>UITextView</code></p>]]></description><link>https://weekly.elfitz.com/2022/05/31/add-a-height-flexible-placeholder-to-your-uitextview/</link><guid isPermaLink="false">62019443ce25b6003bfbc5ee</guid><category><![CDATA[User Interface]]></category><category><![CDATA[Swift Programming]]></category><category><![CDATA[iOS Development]]></category><category><![CDATA[Mobile Development]]></category><category><![CDATA[Programming]]></category><dc:creator><![CDATA[ElFitz]]></dc:creator><pubDate>Tue, 31 May 2022 15:00:44 GMT</pubDate><media:content url="https://images.unsplash.com/photo-1516131206008-dd041a9764fd?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=MnwxMTc3M3wwfDF8c2VhcmNofDEzN3x8TW9ja3VwfGVufDB8fHx8MTY0NDI3MDk2OA&amp;ixlib=rb-1.2.1&amp;q=80&amp;w=2000" medium="image"/><content:encoded><![CDATA[<img src="https://images.unsplash.com/photo-1516131206008-dd041a9764fd?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=MnwxMTc3M3wwfDF8c2VhcmNofDEzN3x8TW9ja3VwfGVufDB8fHx8MTY0NDI3MDk2OA&amp;ixlib=rb-1.2.1&amp;q=80&amp;w=2000" alt="Add a height-flexible placeholder to your UITextView"><p>Once you have tasted SwiftUI, this kind of UIKit lacking makes you wish you could refactor a client&apos;s whole codebase at a snap from your fingers. At least the UI side of it. But, alas, there is no magic spell for that. Yet.</p><p>What kind of lacking? <code>UITextView</code> doesn&apos;t have a placeholder property, and no sensible way of setting one up is provided.</p><p>So, how do we do it? 
While some would suggest <a href="https://stackoverflow.com/a/27652289/7243001?ref=weekly.elfitz.com">setting the <code>UITextView</code>&apos;s text property to your desired placeholder, and changing the text&apos;s style back and forth</a>, I prefer the <a href="https://stackoverflow.com/a/28271069/7243001?ref=weekly.elfitz.com">floating placeholder</a> approach. Mainly because I prefer something self-contained and easily reusable.</p><p>So, let&apos;s make just that, shall we?</p><h1 id="add-the-placeholder-uilabel">Add the placeholder UILabel</h1><p>We create a new <code>UILabel</code>, and add it as one of our <code>UITextView</code>&apos;s subviews. In order to make it easily reusable, we add this in a <code>UITextView</code> extension.</p><p>To keep things simple for the rest of our app, we&apos;ll add a computed <code>placeholder</code> string property, used to set &amp; retrieve our placeholder text. This will take care of creating our <code>placeholderLabel</code> if we try to set some placeholder text and the label doesn&apos;t already exist.</p><pre><code class="language-swift">import UIKit

extension UITextView {

    public var placeholder: String? {
        get {
            self.placeholderLabel?.text
        }
        set {
            if let placeholderLabel = placeholderLabel {
                placeholderLabel.text = newValue
            } else {
                self.addPlaceholderLabel().text = newValue
            }
        }
    }

    private var placeholderLabel: UILabel? {
        get {
            self.viewWithTag(placeholderLabelViewTag) as? UILabel
        }
    }

    private var placeholderLabelViewTag: Int {
        // Arbitrary tag used to retrieve the placeholder label among subviews;
        // make sure it does not clash with other view tags in your hierarchy
        100
    }

    fileprivate func addPlaceholderLabel() -&gt; UILabel {
        let newPlaceholderLabel = UILabel()
        self.setPlaceholderLabelTextConfig(label: newPlaceholderLabel)
        self.addSubview(newPlaceholderLabel)
        return newPlaceholderLabel
    }

    fileprivate func setPlaceholderLabelTextConfig(label: UILabel) {
        label.tag = placeholderLabelViewTag
    }
}</code></pre><h1 id="configure-our-uilabels-text-appearance">Configure our UILabel&apos;s text appearance</h1><p>Our placeholder label&apos;s text should have a distinct appearance</p><pre><code class="language-swift">import UIKit

extension UITextView {

    public var placeholder: String? {
        get {
            self.placeholderLabel?.text
        }
        set {
            (self.placeholderLabel ?? addPlaceholderLabel()).text = newValue
        }
    }

    private var placeholderLabel: UILabel? {
        get {
            self.viewWithTag(placeholderLabelViewTag) as? UILabel
        }
    }

    private var defaultPlaceholderTextColor: UIColor {
        if #available(iOS 13.0, *) {
            return UIColor.systemGray3
        } else {
            return UIColor.lightGray
        }
    }

    private var placeholderLabelViewTag: Int {
        100
    }

    fileprivate func addPlaceholderLabel() -&gt; UILabel {
        let newPlaceholderLabel = UILabel()
        self.setPlaceholderLabelTextConfig(label: newPlaceholderLabel)
        self.addSubview(newPlaceholderLabel)
        return newPlaceholderLabel
    }

    fileprivate func setPlaceholderLabelTextConfig(label: UILabel) {
        label.font = self.font
        label.lineBreakMode = .byWordWrapping
        label.allowsDefaultTighteningForTruncation = true
        label.adjustsFontSizeToFitWidth = true
        label.numberOfLines = 0
        label.tag = placeholderLabelViewTag
        label.isHidden = !self.text.isEmpty
        label.textColor = defaultPlaceholderTextColor
    }
}</code></pre><h1 id="hide-the-placeholder-when-it-is-not-needed">Hide the placeholder when it is not needed</h1><p>(ie when the UITextView actually displays any text)</p><pre><code class="language-swift">import UIKit

extension UITextView {

    public var placeholder: String? {
        get {
            self.placeholderLabel?.text
        }
        set {
            (self.placeholderLabel ?? addPlaceholderLabel()).text = newValue
        }
    }

    private var placeholderLabel: UILabel? {
        get {
            self.viewWithTag(placeholderLabelViewTag) as? UILabel
        }
    }

    private var defaultPlaceholderTextColor: UIColor {
        if #available(iOS 13.0, *) {
            return UIColor.systemGray3
        } else {
            return UIColor.lightGray
        }
    }

    private var placeholderLabelViewTag: Int {
        100
    }

    fileprivate func addPlaceholderLabel() -&gt; UILabel {
        let newPlaceholderLabel = UILabel()
        self.setPlaceholderLabelTextConfig(label: newPlaceholderLabel)
        self.addSubview(newPlaceholderLabel)
        self.setupPlaceholderTextViewObserver()
        return newPlaceholderLabel
    }

    fileprivate func setPlaceholderLabelTextConfig(label: UILabel) {
        label.font = self.font
        label.lineBreakMode = .byWordWrapping
        label.allowsDefaultTighteningForTruncation = true
        label.adjustsFontSizeToFitWidth = true
        label.numberOfLines = 0
        label.tag = placeholderLabelViewTag
        label.isHidden = !self.text.isEmpty
        label.textColor = defaultPlaceholderTextColor
    }

    fileprivate func setupPlaceholderTextViewObserver() {
        NotificationCenter.default.addObserver(
            self,
            selector: #selector(textViewDidChange),
            name: UITextView.textDidChangeNotification,
            object: self // only observe this text view, not every UITextView instance
        )
    }

    // Handler for UITextView.textDidChangeNotification, not the UITextViewDelegate method
    @objc public func textViewDidChange() {
        self.placeholderLabel?.isHidden = !self.text.isEmpty
    }
}</code></pre><h1 id="set-its-size-programmatically-using-autolayout-constraints">Set its size programmatically using AutoLayout constraints</h1><p>So that it correctly fits over our <code>UITextView</code></p><pre><code class="language-swift">import UIKit

extension UITextView {

    public var placeholder: String? {
        get {
            self.placeholderLabel?.text
        }
        set {
            (self.placeholderLabel ?? addPlaceholderLabel()).text = newValue
        }
    }

    private var placeholderLabel: UILabel? {
        get {
            self.viewWithTag(placeholderLabelViewTag) as? UILabel
        }
    }

    private var defaultPlaceholderTextColor: UIColor {
        if #available(iOS 13.0, *) {
            return UIColor.systemGray3
        } else {
            return UIColor.lightGray
        }
    }

    private var placeholderLabelViewTag: Int {
        100
    }

    fileprivate func addPlaceholderLabel() -&gt; UILabel {
        let newPlaceholderLabel = UILabel()
        self.setPlaceholderLabelTextConfig(label: newPlaceholderLabel)
        self.addSubview(newPlaceholderLabel)
        self.setPlaceholderLabelConstraints(label: newPlaceholderLabel)
        self.setupPlaceholderTextViewObserver()
        return newPlaceholderLabel
    }

    fileprivate func setPlaceholderLabelTextConfig(label: UILabel) {
        label.font = self.font
        label.lineBreakMode = .byWordWrapping
        label.allowsDefaultTighteningForTruncation = true
        label.adjustsFontSizeToFitWidth = true
        label.numberOfLines = 0
        label.tag = placeholderLabelViewTag
        label.isHidden = !self.text.isEmpty
        label.textColor = defaultPlaceholderTextColor
    }

    fileprivate func setPlaceholderLabelConstraints(label: UILabel) {
        label.translatesAutoresizingMaskIntoConstraints = false
        let placeholderLabelPadding = self.textContainer.lineFragmentPadding
        let placeholderConstraints = [
            NSLayoutConstraint(item: label, attribute: .left, relatedBy: .equal, toItem: self, attribute: .left, multiplier: 1, constant: placeholderLabelPadding + 9),
            NSLayoutConstraint(item: label, attribute: .right, relatedBy: .equal, toItem: self, attribute: .right, multiplier: 1, constant: -(placeholderLabelPadding + 9)),
            NSLayoutConstraint(item: label, attribute: .top, relatedBy: .equal, toItem: self, attribute: .top, multiplier: 1, constant: placeholderLabelPadding),
            NSLayoutConstraint(item: label, attribute: .bottom, relatedBy: .lessThanOrEqual, toItem: self, attribute: .bottom, multiplier: 1, constant: -placeholderLabelPadding)
        ]
        for constraint in placeholderConstraints {
            constraint.priority = .required
        }
        NSLayoutConstraint.activate(placeholderConstraints)
    }

    fileprivate func setupPlaceholderTextViewObserver() {
        NotificationCenter.default.addObserver(
            self,
            selector: #selector(textViewDidChange),
            name: UITextView.textDidChangeNotification,
            object: self // only observe this text view, not every UITextView instance
        )
    }

    // Handler for UITextView.textDidChangeNotification, not the UITextViewDelegate method
    @objc public func textViewDidChange() {
        self.placeholderLabel?.isHidden = !self.text.isEmpty
    }
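
    /*
     Quick usage sketch (hypothetical call site, not part of the extension):

         let textView = UITextView()
         textView.placeholder = "Tell us everything..."
         // The label stays visible while textView.text is empty, and the
         // textDidChangeNotification observer above hides it once the user types.
    */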
}</code></pre><figure class="kg-card kg-bookmark-card kg-card-hascaption"><a class="kg-bookmark-container" href="https://stackoverflow.com/questions/26180822/how-to-add-constraints-programmatically-using-swift?ref=weekly.elfitz.com"><div class="kg-bookmark-content"><div class="kg-bookmark-title">How to add constraints programmatically using Swift</div><div class="kg-bookmark-description">I&#x2019;m trying to figure this out since last week without going any step further. Ok, so I need to apply some constraints programmatically in Swift to a UIView using this code: var new_view:UIView! = ...</div><div class="kg-bookmark-metadata"><img class="kg-bookmark-icon" src="https://cdn.sstatic.net/Sites/stackoverflow/Img/apple-touch-icon.png?v=c78bd457575a" alt="Add a height-flexible placeholder to your UITextView"><span class="kg-bookmark-author">Stack Overflow</span><span class="kg-bookmark-publisher">Sara Canducci</span></div></div><div class="kg-bookmark-thumbnail"><img src="https://cdn.sstatic.net/Sites/stackoverflow/Img/apple-touch-icon@2.png?v=73d79a89bded" alt="Add a height-flexible placeholder to your UITextView"></div></a><figcaption>If you, like me, sometimes forget how it&apos;s done, here&apos;s a refresher!</figcaption></figure><h1 id="optionally-simplify-configuring-our-placeholder-should-we-have-unexpected-needs">(optionally) Simplify configuring our placeholder should we have unexpected needs</h1><p>Perhaps we may, at some point, want a different font, or linebreak mode, or event colour, for our placeholder, instead of using the <code>UITextView</code> or the (sensible) defaults we&apos;ve used. To avoid ending up creating some random custom <code>UITextView</code> logic in different areas of our app, let&apos;s provide a simple method to customize our placeholder.</p><pre><code class="language-swift">import UIKit

extension UITextView {

    public struct PlaceholderConfig {
        let font: UIFont?
        let lineBreakMode: NSLineBreakMode
        let allowsDefaultTighteningForTruncation: Bool
        let adjustsFontSizeToFitWidth: Bool
        let numberOfLines: Int
        let textColor: UIColor?
    }

    /// The UITextView placeholder text
    public var placeholder: String? {
        get {
            self.placeholderLabel?.text
        }
        set {
            (self.placeholderLabel ?? addPlaceholderLabel()).text = newValue
        }
    }

    public func configPlaceholder(placeholder: String?, placeholderConfig: PlaceholderConfig?) {
        // Reuse an existing placeholder label rather than stacking a duplicate subview
        let label = self.placeholderLabel ?? self.addPlaceholderLabel(placeholderConfig: placeholderConfig)
        self.setPlaceholderLabelTextConfig(label: label, placeholderConfig: placeholderConfig)
        label.text = placeholder
    }
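
    /*
     Usage sketch (hypothetical call site, not part of this extension).
     Passing nil as placeholderConfig falls back to defaultPlaceholderConfig:

         textView.configPlaceholder(placeholder: "Add a note...", placeholderConfig: nil)

     Or override the defaults, e.g. for a single truncating line:

         textView.configPlaceholder(
             placeholder: "Add a note...",
             placeholderConfig: UITextView.PlaceholderConfig(
                 font: textView.font,
                 lineBreakMode: .byTruncatingTail,
                 allowsDefaultTighteningForTruncation: true,
                 adjustsFontSizeToFitWidth: false,
                 numberOfLines: 1,
                 textColor: .systemGray))
    */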

    private var placeholderLabel: UILabel? {
        get {
            self.viewWithTag(placeholderLabelViewTag) as? UILabel
        }
    }

    private var defaultPlaceholderTextColor: UIColor {
        if #available(iOS 13.0, *) {
            return UIColor.systemGray3
        } else {
            return UIColor.lightGray
        }
    }

    private var defaultPlaceholderConfig: PlaceholderConfig {
        PlaceholderConfig(
            font: self.font,
            lineBreakMode: .byWordWrapping,
            allowsDefaultTighteningForTruncation: true,
            adjustsFontSizeToFitWidth: true,
            numberOfLines: 0,
            textColor: self.defaultPlaceholderTextColor
        )
    }

    private var placeholderLabelViewTag: Int {
        100
    }

    fileprivate func addPlaceholderLabel(placeholderConfig: PlaceholderConfig? = nil) -&gt; UILabel {
        let newPlaceholderLabel = UILabel()
        self.setPlaceholderLabelTextConfig(label: newPlaceholderLabel, placeholderConfig: placeholderConfig)
        self.addSubview(newPlaceholderLabel)
        self.setPlaceholderLabelConstraints(label: newPlaceholderLabel)
        self.setupPlaceholderTextViewObserver()
        return newPlaceholderLabel
    }

    fileprivate func setPlaceholderLabelTextConfig(label: UILabel, placeholderConfig: PlaceholderConfig?) {
        let configForPlaceholder = placeholderConfig ?? defaultPlaceholderConfig
        label.font = configForPlaceholder.font
        label.lineBreakMode = configForPlaceholder.lineBreakMode
        label.allowsDefaultTighteningForTruncation = configForPlaceholder.allowsDefaultTighteningForTruncation
        label.adjustsFontSizeToFitWidth = configForPlaceholder.adjustsFontSizeToFitWidth
        label.numberOfLines = configForPlaceholder.numberOfLines
        label.tag = placeholderLabelViewTag
        label.isHidden = !self.text.isEmpty
        label.textColor = configForPlaceholder.textColor
    }

    fileprivate func setPlaceholderLabelConstraints(label: UILabel) {
        label.translatesAutoresizingMaskIntoConstraints = false
        let placeholderLabelPadding = self.textContainer.lineFragmentPadding
        let placeholderConstraints = [
            NSLayoutConstraint(item: label, attribute: .left, relatedBy: .equal, toItem: self, attribute: .left, multiplier: 1, constant: placeholderLabelPadding + 9),
            NSLayoutConstraint(item: label, attribute: .right, relatedBy: .equal, toItem: self, attribute: .right, multiplier: 1, constant: -(placeholderLabelPadding + 9)),
            NSLayoutConstraint(item: label, attribute: .top, relatedBy: .equal, toItem: self, attribute: .top, multiplier: 1, constant: placeholderLabelPadding),
            NSLayoutConstraint(item: label, attribute: .bottom, relatedBy: .lessThanOrEqual, toItem: self, attribute: .bottom, multiplier: 1, constant: -placeholderLabelPadding)
        ]
        for constraint in placeholderConstraints {
            constraint.priority = .required
        }
        NSLayoutConstraint.activate(placeholderConstraints)
    }

    fileprivate func setupPlaceholderTextViewObserver() {
        NotificationCenter.default.addObserver(
            self,
            selector: #selector(textViewDidChange),
            name: UITextView.textDidChangeNotification,
            object: self // Only observe this text view, not every UITextView in the app
        )
    }

    @objc public func textViewDidChange() {
        self.placeholderLabel?.isHidden = !self.text.isEmpty
    }
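
    // Usage sketch (the text view name below is illustrative, not part of the extension):
    //
    //     let notesTextView = UITextView()
    //     notesTextView.placeholder = "Write your notes here"
    //
    //     // Or, with a custom appearance:
    //     notesTextView.configPlaceholder(
    //         placeholder: "Write your notes here",
    //         placeholderConfig: PlaceholderConfig(
    //             font: .preferredFont(forTextStyle: .body),
    //             lineBreakMode: .byWordWrapping,
    //             allowsDefaultTighteningForTruncation: true,
    //             adjustsFontSizeToFitWidth: true,
    //             numberOfLines: 0,
    //             textColor: .lightGray))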
}</code></pre><hr><h1 id="epilogue">Epilogue</h1><p>And that&apos;s it! Hope you liked it, and that it will prove useful to you! Enjoy your day, and, as usual, should you encounter any issues or have any suggestions for improvements, <a href="https://twitter.com/ElFitz_?ref=weekly.elfitz.com">reach out to me on Twitter!</a></p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://images.unsplash.com/photo-1501438400798-b40ff50396c8?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=MnwxMTc3M3wwfDF8c2VhcmNofDEwfHxiZWFjaCUyMGNvY2t0YWlsfGVufDB8fHx8MTY1MzY2Nzg1NA&amp;ixlib=rb-1.2.1&amp;q=80&amp;w=2000" class="kg-image" alt="Add a height-flexible placeholder to your UITextView" loading="lazy" width="6886" height="4591" srcset="https://images.unsplash.com/photo-1501438400798-b40ff50396c8?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=MnwxMTc3M3wwfDF8c2VhcmNofDEwfHxiZWFjaCUyMGNvY2t0YWlsfGVufDB8fHx8MTY1MzY2Nzg1NA&amp;ixlib=rb-1.2.1&amp;q=80&amp;w=600 600w, https://images.unsplash.com/photo-1501438400798-b40ff50396c8?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=MnwxMTc3M3wwfDF8c2VhcmNofDEwfHxiZWFjaCUyMGNvY2t0YWlsfGVufDB8fHx8MTY1MzY2Nzg1NA&amp;ixlib=rb-1.2.1&amp;q=80&amp;w=1000 1000w, https://images.unsplash.com/photo-1501438400798-b40ff50396c8?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=MnwxMTc3M3wwfDF8c2VhcmNofDEwfHxiZWFjaCUyMGNvY2t0YWlsfGVufDB8fHx8MTY1MzY2Nzg1NA&amp;ixlib=rb-1.2.1&amp;q=80&amp;w=1600 1600w, https://images.unsplash.com/photo-1501438400798-b40ff50396c8?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=MnwxMTc3M3wwfDF8c2VhcmNofDEwfHxiZWFjaCUyMGNvY2t0YWlsfGVufDB8fHx8MTY1MzY2Nzg1NA&amp;ixlib=rb-1.2.1&amp;q=80&amp;w=2400 2400w" sizes="(min-width: 720px) 720px"><figcaption>Photo by <a href="https://unsplash.com/@eviradauscher?utm_source=ghost&amp;utm_medium=referral&amp;utm_campaign=api-credit">Evi Radauscher</a> / <a 
href="https://unsplash.com/?utm_source=ghost&amp;utm_medium=referral&amp;utm_campaign=api-credit">Unsplash</a></figcaption></figure>]]></content:encoded></item></channel></rss>