<?xml version="1.0" encoding="UTF-8"?><rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" version="2.0" xmlns:cc="http://cyber.law.harvard.edu/rss/creativeCommonsRssModule.html">
    <channel>
        <title><![CDATA[Stories by Sebastián Ramírez on Medium]]></title>
        <description><![CDATA[Stories by Sebastián Ramírez on Medium]]></description>
        <link>https://medium.com/@tiangolo?source=rss-963974981597------2</link>
        <image>
            <url>https://cdn-images-1.medium.com/fit/c/150/150/2*WjlXOFmk0C6iDwd2e_pg8w.jpeg</url>
            <title>Stories by Sebastián Ramírez on Medium</title>
            <link>https://medium.com/@tiangolo?source=rss-963974981597------2</link>
        </image>
        <generator>Medium</generator>
        <lastBuildDate>Sun, 12 Apr 2026 17:59:54 GMT</lastBuildDate>
        <atom:link href="https://medium.com/@tiangolo/feed" rel="self" type="application/rss+xml"/>
        <webMaster><![CDATA[yourfriends@medium.com]]></webMaster>
        <atom:link href="http://medium.superfeedr.com" rel="hub"/>
        <item>
            <title><![CDATA[The Future of Education and Art]]></title>
            <link>https://tiangolo.medium.com/the-future-of-education-and-art-e515f664b778?source=rss-963974981597------2</link>
            <guid isPermaLink="false">https://medium.com/p/e515f664b778</guid>
            <category><![CDATA[education]]></category>
            <category><![CDATA[jobs]]></category>
            <category><![CDATA[art]]></category>
            <dc:creator><![CDATA[Sebastián Ramírez]]></dc:creator>
            <pubDate>Wed, 17 Aug 2022 10:59:30 GMT</pubDate>
            <atom:updated>2022-08-17T11:02:22.733Z</atom:updated>
<content:encoded><![CDATA[<figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*gXY6kHjdF_5m3GIlHEstZA.png" /></figure><p>This article lives in:</p><ul><li><a href="https://dev.to/tiangolo/the-future-of-education-and-art-4h0e">Dev.to</a></li><li><a href="https://medium.com/@tiangolo/the-future-of-education-and-art-e515f664b778">Medium</a></li><li><a href="https://github.com/tiangolo/blog-posts/tree/master/the-future-of-education-and-art">GitHub</a></li></ul><h3>Intro</h3><p>The current <strong>traditional education system is somewhat broken</strong>.</p><p>The <strong>jobs</strong> and requirements based on this system are also somewhat broken. Even more so when combined with other arbitrary rules like fixed <strong>years of experience</strong>.</p><p>Our <strong>societies are at risk</strong>. If we don’t change those things, and if we don’t do it fast enough, a growing wave of poverty and worsening power and economic imbalances could do a lot of damage.</p><p><strong>We can fix it</strong>.</p><p>And the <strong>future of education and art go together</strong>, at least I think they should.</p><p>I normally don’t think that whatever I have to say could be important. But people keep asking me for my opinions on random things and making me feel like my ideas are sometimes useful.</p><p>This is one of those opinions I have, normally not so public, and probably controversial. But maybe others could find it useful too, so I’m writing this.</p><p>This is just my point of view. It doesn’t represent anyone. And I can also change, adjust, and clarify my own point of view later. But maybe some of these ideas can be useful to others.</p><h3>The Traditional Education System</h3><p>The traditional education system seems designed to train people to adapt to and work in the <strong>industrial revolution</strong>, in factories and such. That was a long time ago. 
Go early, dress the same, sit at a table, have a supervisor (teacher) in front of many people, do the same things as others do, etc.</p><p>People are given lots and <strong>lots of information</strong>, in most cases, <strong>without much purpose</strong>. I think most things learned and taught should have some purpose; otherwise, it’s very difficult to learn them, and even more so to apply that knowledge. But anyway, that could be a longer post.</p><p>Good institutions can normally afford very good teachers.</p><p>But these institutions and those teachers are located at <strong>a specific point on earth</strong>.</p><p>Only the people around that point or the ones that can afford to go there can apply to study there. And not many are admitted because the space, teachers, and resources are limited.</p><p>This ends up being some intrinsic <strong>gatekeeping</strong>. It’s unavoidable: you just can’t fit more people in that room of the building.</p><p>But it’s not because the knowledge, the teacher’s style, or resources should be limited to some people. It’s just because the structure doesn’t allow more people to access it.</p><p>And then, in that same structure, the teacher has to teach the same class year after year. Even if it’s the same information, even if nothing new is added or changed, there’s a lot of <strong>repetition</strong> there.</p><p>Many institutions around the world teach the same concepts and ideas. But they end up coming up with different ways to do it. Some can afford to make <strong>better materials</strong>, methods, or ways of teaching, but that’s kept for only that institution. Other institutions keep teaching with the same old resources as before, even if those were improved somewhere else.</p><p>And in some cases, well-regarded institutions <strong>teach old or somewhat obsolete concepts</strong>, or even proprietary tools that students will never be able to use again (e.g. 
a programming language), maybe just because the institution had a partnership with the provider of that tool. And when they go out and start working, they end up having to use a completely different tool, and learn it on the job either way.</p><p>But still, people end up wanting to move to some city to be able to attend some school, but the information or learnings are not necessarily tied to that particular place; it’s just a coincidence.</p><p>Of course, some professions need hands-on physical experience, like doctors. But many professions are not tightly coupled to space, even more so in technology and digital professions.</p><h3>Education Systems Change Lives</h3><p>I’m not against education, and I know from people very close to me that these same education systems can and do <strong>change lives</strong>.</p><p>When people have the choice of studying there or nothing, the difference in education is massive.</p><p>And many have the fortune of having <strong>amazing teachers</strong> that guide them and help them a lot. But that’s not always the case.</p><p>And in some cases, some teachers that want and try to be better, to give practical materials and learnings, are stopped by the mechanics of what is expected in the system, maybe even just because it’s easier for the administration to keep things unchanged.</p><p>And also, some education institutions are doing a great job of <strong>evolving and adapting</strong> to these new changes, which is amazing.</p><p>I’m not against any of that. I only think education systems need to <strong>evolve faster</strong>; we need to evolve faster, and find ways to teach faster, better, in more accessible ways, more dynamically. Because technology and advancements are not gonna wait for us.</p><h3>Technology and Job Displacement</h3><p>Technology has <strong>displaced jobs</strong> for a long time. Old jobs disappear, slowly or quickly, and new jobs appear. 
Old jobs are replaced by new ones.</p><p>Education systems have to (or at least should) adapt to this and teach the new jobs and skills.</p><p>But <strong>technology moves fast</strong>, and maybe gets faster all the time. And the speed at which technology can alter jobs is higher than the speed at which traditional education systems evolve.</p><p>Traditional education systems tend to expect very long periods dedicated only to <strong>absorbing new knowledge</strong>, in many cases without much purpose or practice, and then they expect that the next periods will be dedicated to <strong>applying that information</strong>, without acquiring new information.</p><p>But jobs related to or altered by technology make that idea of one long period studying and one long period applying <strong>less practical</strong>.</p><p>Technology changes fast. If you start studying something expecting it to take as long as traditional education systems would, by the time you finish, there might be some better alternative. But if you expect not to study again, you will not learn that new better alternative.</p><p>I think it would make more sense to <strong>start studying and start applying as soon as possible</strong>. And <strong>continue studying and applying forever</strong>.</p><p>Arguments against technology (e.g. against artificial intelligence) tend to include <strong>job displacement</strong>. Because for people that have been doing the same thing for a long time and don’t have a way to change, update, and adapt, getting their job displaced would be very, very bad.</p><p>It’s common to hear “artificial intelligence will replace us” and alarming things like that. But it’s not a new problem: artificial intelligence is just technology, and technology has been changing jobs forever.</p><p>Technology, artificial intelligence, and automation are all <strong>just tools</strong>.</p><p>It’s like a <strong>knife</strong>. 
The first knife used to cut meat was probably very useful; it was probably a great invention, and it surely helped people a lot then. But then a knife could also be used to kill someone. That doesn’t necessarily make the knife itself a bad thing that should be avoided, but people should be taught to use it so that more people use it to improve lives than to damage them.</p><p>With technology, we could have the power to reduce the need for many hazardous, repetitive, and sometimes numbing jobs, and increase efficiency for everyone. We could have the power to increase food production, reduce waste, reduce energy consumption. <strong>We could solve a lot of the problems in our society and on our planet</strong>.</p><p>But then, we come again to the <strong>current traditional education systems</strong>. If people can’t access the information easily, if it’s not made to be easy to understand and apply, if it doesn’t have a clear purpose, then we don’t have a way to help people overcome the challenges from fast-moving technology, to get new jobs, to take advantage of those improvements in technology.</p><p>And powers or billionaires with more control of technology would just get more power, money, and control, while the rest get less, and the imbalance would just get worse.</p><p>It could go really bad, or <strong>we could make it go well for everyone</strong>.</p><h3>Job Requirements</h3><p>The next problem is the traditional system of how to <strong>hire people</strong>.</p><p>The process of hiring people tends to be based on <strong>very old practices</strong> and assumptions that look more and more <strong>absurd</strong> every day.</p><p>If a job requires someone to have a <strong>specific degree</strong> from the traditional education system, that is gatekeeping and discarding anyone who could have learned the same things or more with alternative methods (it would certainly discard me; I was homeschooled all my life 😅).</p><p>And knowing that 
traditional education systems are not able to evolve and adapt as fast as technology, <strong>why require someone to have a specific degree</strong> when that is not really what they are gonna need in a job that is evolving at the pace of technology?</p><p>A possible answer is that “they will have the foundational concepts of some topic”. But then, the requirement tends to be “a degree in X”. But they never examine the topics taught in that “X” program at that particular institution to see if the person knows what is needed. So, why such a strong emphasis on a particular degree when there’s really no emphasis on the actual knowledge or skills?</p><h3>Years of Experience</h3><p>The same goes for <strong>years of experience</strong>. It’s just a proxy indicator for the skill level, but it’s quite a bad proxy.</p><p>One person could have 5 or 10 years working with the same technology, doing the same thing day after day. They are not learning something new. And the skills they can gain in all that time are not that many.</p><p>And another person could work at a startup where they have to learn new skills very fast, take on new challenges, even lead colleagues, and <strong>acquire the equivalent of “10 years of experience” in 2 years</strong>.</p><p>Companies and job requirements have to be updated as well, to check for actual skills, or at least better indicators of those skills, with <strong>less focus on particular degrees and years of experience</strong>.</p><h3>Governments</h3><p>Of course, <strong>governments could be greatly improved</strong> as well. They tend to be the slowest of them all. 
And many things in governments are still very tightly coupled to traditional and old structures, like traditional education system degrees.</p><p>But they are probably the most difficult to change, and they will slowly evolve and will have to adapt by force at some point.</p><p>That’s a much harder battle, so we can focus on the things we can have an impact on right now, because we need changes to start right now.</p><h3>New Education Systems</h3><p>Coming back to education systems, there are new ones. I’m not inventing anything new here.</p><p>The internet, with articles, documentation, videos, and massive online courses (in many cases free), provides a way for people around the world to access the <em>same</em> content.</p><p>Grading can be done by software online. It’s not great, but it’s not much worse than tests in traditional education systems, and can be scaled much better. The topic of how to test knowledge and the emphasis on those standard educational tests can also be greatly improved, but that’s also another longer post.</p><p>Having these resources available online helps a lot with the limits and gatekeeping of <strong>physical locations</strong> on the planet. It also helps with the problem of the limited seats in a given room in a building: <strong>the internet doesn’t have limited seats</strong>.</p><p>Even if the classes are done by the same teachers in person, accessing and using the best materials available online can help a lot.</p><p>But one of the best parts of all this is that the best teachers in the world can give their <strong>best classes once</strong>, record them, and everyone else can access them. No need for the teachers to work a lot more; at least the work is not proportional to the number of people that can access the information.</p><p>And the <strong>content can be improved and refined</strong> by them or even by others over time. And the best ways to teach something can emerge and improve. 
So that people can learn things easily and fast, with <strong>less effort</strong> every time, so more people can learn those things instead of having knowledge reserved only for the most brilliant ones who were able to disentangle a concept from a dry explanation.</p><p>Someone I highly admire seems to have realized all this several years ago. Andrew Ng was doing pioneering work on artificial intelligence; he probably realized all the job displacement it could produce, and that the next problem to solve for society would be <strong>education</strong> and access to it. And he went and founded Coursera, an online education platform. I think something similar happened with Sebastian Thrun and Udacity. And probably the same with edX and others too.</p><h3>How New Education Systems Affected Me</h3><p>I’m from Colombia, in Latin America. I was homeschooled all my life, and I didn’t go to university (actually, not even to elementary school).</p><p>But I had access (sometimes) to the internet. I was able to download pages to read offline, learning about web technologies and from Wikipedia articles.</p><p>And then when these online course platforms appeared (and I had stable internet), I was able to take <strong>the same classes</strong> from well-regarded institutions that people in those places could take; otherwise, I wouldn’t have been able to.</p><p>Here’s an interesting detail. I’m <strong>not against education institutions</strong>, not even traditional ones. The courses I studied were mostly from those same institutions. I’m <strong>against the system that enforces limiting information</strong> to only a few people based on arbitrary coincidences (location) and rules, and implicitly gatekeeping that knowledge, when making it more accessible doesn’t cost more and can help many more people.</p><p>In those courses, I was taught <strong>very complex concepts</strong> in ways that seemed easy to understand; at times, I even felt smart. 
I studied the same concepts in different courses through time, and I also saw other people studying and struggling with the same concepts. And that let me see that <strong>some ways of teaching</strong> something were <strong>better than others</strong>.</p><p>In some cases, the same concept <strong>could look much more complex</strong> than when it was explained differently. <strong>The way something is explained makes a huge difference</strong>. It makes a difference in the effort needed to learn, the speed at which something can be learned, the ability to apply the concept afterwards, etc.</p><h3>Art for Education</h3><p>This is where <strong>art</strong> comes in.</p><p>A possible definition of art would be <strong>the use of aesthetics and imagination to communicate something</strong>.</p><p>It doesn’t have to be jolly: you could have a very <strong>sad and powerful</strong> piece of art, a song, a painting. But it uses aesthetics, sensation, and imagination to communicate that sadness in a powerful way.</p><p>And education could be thought of as <strong>the communication of useful information that can be applied</strong>.</p><p>Now, art would be one of the most powerful ways to communicate in general (if not <strong>the most</strong> powerful). And education is also just about communicating things well.</p><p>I think <strong>art could and should be used for education</strong>, to teach things.</p><p>How many things are taught to kids via songs? Have you heard the saying that “a picture is worth a thousand words”?</p><p><strong>Art can accelerate the process of learning, teaching, and education in general.</strong></p><p>And now that each piece of content, each resource that is made, can be available online to millions of people around the world, making art to explain a concept is worth it.</p><p>Before, making a piece of art, only for one class, given to a handful of people, might not have been worth the effort, time, and money. 
But if you can teach millions, then all that investment pays off over time, through people learning.</p><p>There’s also the strange coincidence that artists tend to have a difficult life: it’s not easy to succeed, at least economically, via the arts. It tends to work for only a handful who take all the fame and money, but it doesn’t work well for the great majority.</p><p>But if art is not dedicated only to entertainment, but also to <strong>education</strong>, there are a lot of artists who can help; that resource is already there, and doing that would of course help them. A few hundred or a few thousand dollars for work that can then be reused for a long time with many people can be very fruitful for the thing that is being taught.</p><p>I imagine courses (mostly online, because those are the ones I’ve taken the most), where from time to time, they hire a new illustrator or 3D artist who makes a new animation that explains the concept that otherwise you would have to just imagine. Or musicians making short songs to teach ideas or things that need to be memorized, etc.</p><p>As part of that, there are also many things that improve how something is taught that I would consider art, or somewhat art at least. Like cues that draw attention to some content (e.g. some piece of code), diagrams, tools to show and order the information in a beautiful way, etc. Making sure that something is engaging, that it’s <strong>easy to communicate</strong>, is what ensures that it will have the intended effect, that it will be useful.</p><h3>A Small Example Using Art for Education</h3><p>I have been playing with this idea for a while. Some years ago, I took an online course on edX from Berkeley about artificial intelligence. It was <strong>full of illustrations</strong>, and it was clear that the illustrator knew the concepts, because the illustrations were clear, easy to understand, and accurate. 
I didn’t even think the course was “easy”, but it was so <strong>enjoyable</strong>! And the concepts were much <strong>easier and faster to understand</strong>. I studied some of the same concepts in several other courses, and the difference between having clear illustrations and art and having only dense descriptions was really big.</p><p>I wanted to have <strong>illustrations for FastAPI</strong> too. So I recently hired <a href="https://ketrinadrawsalot.tumblr.com/">Ketrina</a>, the same illustrator from that edX course, and asked her to make some illustrations for the documentation explaining <a href="https://fastapi.tiangolo.com/async/#concurrency-and-burgers">concurrency and parallelism in FastAPI</a>.</p><p>I hope this helps explain the concepts better and more easily.</p><p>I hope others can also start using art to explain things better, more easily, and faster, and hopefully <strong>make education more accessible to everyone</strong>.</p><h3>About me</h3><p>Hey! 👋 I’m Sebastián Ramírez (<a href="https://tiangolo.com/">tiangolo</a>).</p><p>You can follow me, contact me, see what I do, or use my open source code:</p><ul><li><a href="https://github.com/tiangolo">GitHub: tiangolo</a></li><li><a href="https://twitter.com/tiangolo">Twitter: tiangolo</a></li><li><a href="https://www.linkedin.com/in/tiangolo/">LinkedIn: tiangolo</a></li><li><a href="https://dev.to/tiangolo">Dev.to: tiangolo</a></li><li><a href="https://tiangolo.medium.com/">Medium: tiangolo</a></li><li><a href="https://tiangolo.com/">Web: tiangolo.com</a></li></ul><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=e515f664b778" width="1" height="1" alt="">]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[HTTPS for Developers]]></title>
            <link>https://tiangolo.medium.com/https-for-developers-5e42dcf7d4db?source=rss-963974981597------2</link>
            <guid isPermaLink="false">https://medium.com/p/5e42dcf7d4db</guid>
            <category><![CDATA[fastapi]]></category>
            <category><![CDATA[https]]></category>
            <category><![CDATA[devops]]></category>
            <dc:creator><![CDATA[Sebastián Ramírez]]></dc:creator>
            <pubDate>Tue, 28 Sep 2021 08:32:35 GMT</pubDate>
            <atom:updated>2021-09-28T08:34:08.570Z</atom:updated>
<content:encoded><![CDATA[<p>This article lives in:</p><ul><li><a href="https://dev.to/tiangolo/https-for-developers-1774">Dev.to</a></li><li><a href="https://tiangolo.medium.com/https-for-developers-5e42dcf7d4db">Medium</a></li><li><a href="https://github.com/tiangolo/blog-posts/blob/master/https-for-developers/README.md">GitHub</a></li><li><a href="https://fastapi.tiangolo.com/deployment/https/">The FastAPI docs</a></li></ul><h3>Intro</h3><p>Here’s a brief introduction to <strong>HTTPS for developers</strong>. 🔒</p><p>This article is extracted from the <a href="https://fastapi.tiangolo.com/deployment/https/">FastAPI docs about HTTPS</a>.</p><p>I just upgraded those docs with several explanations and diagrams, and I thought the end result was <strong>generic</strong> and <strong>useful</strong> enough for many other developers (even in other <strong>languages</strong> and <strong>frameworks</strong>) to also publish it as a post, so here it is. 🤓</p><h3>Who Is This For</h3><p>If you are a <strong>user</strong> and your only interaction with HTTPS is with the <strong>browser</strong> opening URLs, then you are better off just reading <a href="https://howhttps.works/">How HTTPS Works</a>.</p><p>If you are a <strong>cryptography researcher</strong>, you are better off studying the cryptographic primitives and then reading the standards (RFCs).</p><p>But if you are a <strong>developer</strong> (programmer, coder) and want to know <strong>enough technical details</strong> to understand how it works and <strong>how to use HTTPS</strong> in your applications without going into the depths of cryptography and web standards, then this is for you! 
🎉👇</p><h3>About HTTPS</h3><p>It is easy to assume that HTTPS is something that is just “enabled” or not.</p><p>But it is way more complex than that.</p><p>To <strong>learn the basics of HTTPS</strong>, from a consumer perspective, check <a href="https://howhttps.works/">https://howhttps.works/</a>.</p><p>Now, from a <strong>developer’s perspective</strong>, here are several things to have in mind while thinking about HTTPS:</p><ul><li>For HTTPS, <strong>the server</strong> needs to <strong>have “certificates”</strong> generated by a <strong>third party</strong>.</li><li>Those certificates are actually <strong>acquired</strong> from the third party, not “generated”.</li><li>Certificates have a <strong>lifetime</strong>.</li><li>They <strong>expire</strong>.</li><li>And then they need to be <strong>renewed</strong>, <strong>acquired again</strong> from the third party.</li><li>The encryption of the connection happens at the <strong>TCP level</strong>.</li><li>That’s one layer <strong>below HTTP</strong>.</li><li>So, the <strong>certificate and encryption</strong> handling is done <strong>before HTTP</strong>.</li><li><strong>TCP doesn’t know about “domains”</strong>. 
Only about IP addresses.</li><li>The information about the <strong>specific domain</strong> requested goes in the <strong>HTTP data</strong>.</li><li>The <strong>HTTPS certificates</strong> “certify” a <strong>certain domain</strong>, but the protocol and encryption happen at the TCP level, <strong>before knowing</strong> which domain is being dealt with.</li><li><strong>By default</strong>, that would mean that you can only have <strong>one HTTPS certificate per IP address</strong>.</li><li>No matter how big your server is or how small each application you have on it might be.</li><li>There is a <strong>solution</strong> to this, however.</li><li>There’s an <strong>extension</strong> to the <strong>TLS</strong> protocol (the one handling the encryption at the TCP level, before HTTP) called <a href="https://en.wikipedia.org/wiki/Server_Name_Indication"><strong>SNI</strong></a>.</li><li>This SNI extension allows one single server (with a <strong>single IP address</strong>) to have <strong>several HTTPS certificates</strong> and serve <strong>multiple HTTPS domains/applications</strong>.</li><li>For this to work, a <strong>single</strong> component (program) running on the server, listening on the <strong>public IP address</strong>, must have <strong>all the HTTPS certificates</strong> on the server.</li><li><strong>After</strong> obtaining a secure connection, the communication protocol is <strong>still HTTP</strong>.</li><li>The contents are <strong>encrypted</strong>, even though they are being sent with the <strong>HTTP protocol</strong>.</li></ul><p>It is a common practice to have <strong>one program/HTTP server</strong> running on the server (the machine, host, etc.) 
and <strong>managing all the HTTPS parts</strong>: receiving the <strong>encrypted HTTPS requests</strong>, sending the <strong>decrypted HTTP requests</strong> to the actual HTTP application running on the same server (the <strong>FastAPI</strong> application, in this case), taking the <strong>HTTP response</strong> from the application, <strong>encrypting it</strong> using the appropriate <strong>HTTPS certificate</strong>, and sending it back to the client using <strong>HTTPS</strong>. This server is often called a <a href="https://en.wikipedia.org/wiki/TLS_termination_proxy"><strong>TLS Termination Proxy</strong></a>.</p><p>Some of the options you could use as a TLS Termination Proxy are:</p><ul><li>Traefik (that can also handle certificate renewals)</li><li>Caddy (that can also handle certificate renewals)</li><li>Nginx</li><li>HAProxy</li></ul><h3>Let’s Encrypt</h3><p>Before Let’s Encrypt, these <strong>HTTPS certificates</strong> were sold by trusted third parties.</p><p>The process of acquiring one of these certificates used to be cumbersome and require quite a lot of paperwork, and the certificates were quite expensive.</p><p>But then <a href="https://letsencrypt.org/"><strong>Let’s Encrypt</strong></a> was created.</p><p>It is a project from the Linux Foundation. It provides <strong>HTTPS certificates for free</strong>, in an automated way. These certificates use all the standard cryptographic security, and are short-lived (about 3 months), so the <strong>security is actually better</strong> because of their reduced lifespan.</p><p>The domains are securely verified and the certificates are generated automatically. 
This also allows automating the renewal of these certificates.</p><p>The idea is to automate the acquisition and renewal of these certificates so that you can have <strong>secure HTTPS, for free, forever</strong>.</p><h3>HTTPS for Developers by Example</h3><p>Here’s an example of how an HTTPS API could look, step by step, paying attention mainly to the ideas important for developers.</p><h3>Domain Name</h3><p>It would probably all start by you <strong>acquiring</strong> some <strong>domain name</strong>. Then, you would configure it in a DNS server (possibly your same cloud provider).</p><p>You would probably get a cloud server (a virtual machine) or something similar, and it would have a fixed <strong>public IP address</strong>.</p><p>In the DNS server(s) you would configure a record (an “A record”) to point <strong>your domain</strong> to the public <strong>IP address of your server</strong>.</p><p>You would probably do this just once, the first time, when setting everything up.</p><p><strong>Tip</strong>: This Domain Name part is way before HTTPS, but as everything depends on the domain and the IP address, it’s worth mentioning it here.</p><h3>DNS</h3><p>Now let’s focus on all the actual HTTPS parts.</p><p>First, the browser would check with the <strong>DNS servers</strong> what the <strong>IP for the domain</strong> is, in this case, someapp.example.com.</p><p>The DNS servers would tell the browser to use some specific <strong>IP address</strong>. 
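</p><p>In toy form, that A record behaves like a lookup table mapping the domain to the server’s public IP address. The domain is the article’s example, and the address below is made up, taken from 203.0.113.0/24, a range reserved for documentation:</p>

```python
# A toy model of the single "A record" configured earlier: it maps the
# domain to the server's public IPv4 address. The address is a
# placeholder from the documentation-only 203.0.113.0/24 range.
A_RECORDS = {
    "someapp.example.com": "203.0.113.10",
}


def resolve_a_record(domain: str) -> str:
    """Return the IPv4 address the domain's A record points at."""
    return A_RECORDS[domain]
```

<p>A real client would instead call something like <code>socket.getaddrinfo("someapp.example.com", 443)</code> and let the operating system query the DNS servers. 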
That would be the public IP address used by your server, the one you configured in the DNS servers.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/652/0*QhI7o0PMd20Y-R_q.png" /></figure><h3>TLS Handshake Start</h3><p>The browser would then communicate with that IP address on <strong>port 443</strong> (the HTTPS port).</p><p>The first part of the communication is just to establish the connection between the client and the server and to decide the cryptographic keys they will use, etc.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/880/0*WnY-lLu5KZ65Mzaq.png" /></figure><p>This interaction between the client and the server to establish the TLS connection is called the <strong>TLS handshake</strong>.</p><h3>TLS with SNI Extension</h3><p><strong>Only one process</strong> on the server can be listening on a specific <strong>port</strong> on a specific <strong>IP address</strong>. There could be other processes listening on other ports on the same IP address, but only one for each combination of IP address and port.</p><p>TLS (HTTPS) uses the specific port 443 by default. 
So that&#39;s the port we would need.</p><p>As only one process can be listening on this port, the process that would do it would be the <strong>TLS Termination Proxy</strong>.</p><p>The TLS Termination Proxy would have access to one or more <strong>TLS certificates</strong> (HTTPS certificates).</p><p>Using the <strong>SNI extension</strong> discussed above, the TLS Termination Proxy would check which of the TLS (HTTPS) certificates available it should use for this connection, using the one that matches the domain expected by the client.</p><p>In this case, it would use the certificate for someapp.example.com.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/880/0*m1WGuTWb_7JlaDRs.png" /></figure><p>The client already <strong>trusts</strong> the entity that generated that TLS certificate (in this case Let’s Encrypt, but we’ll see about that later), so it can <strong>verify</strong> that the certificate is valid.</p><p>Then, using the certificate, the client and the TLS Termination Proxy <strong>decide how to encrypt</strong> the rest of the <strong>TCP communication</strong>. This completes the <strong>TLS Handshake</strong> part.</p><p>After this, the client and the server have an <strong>encrypted TCP connection</strong>, this is what TLS provides. And then they can use that connection to start the actual <strong>HTTP communication</strong>.</p><p>And that’s what <strong>HTTPS</strong> is, it’s just plain <strong>HTTP</strong> inside a <strong>secure TLS connection</strong> instead of a pure (unencrypted) TCP connection.</p><p><strong>Tip</strong>: Notice that the encryption of the communication happens at the <strong>TCP level</strong>, not at the HTTP level.</p><h3>HTTPS Request</h3><p>Now that the client and server (specifically the browser and the TLS Termination Proxy) have an <strong>encrypted TCP connection</strong>, they can start the <strong>HTTP communication</strong>.</p><p>So, the client sends an <strong>HTTPS request</strong>. 
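The "plain HTTP" traveling inside the TLS connection is just text with a strict layout. A sketch of roughly what such a request looks like once decrypted (the /items/ path is made up for illustration):

```python
# What travels inside the encrypted TLS connection is ordinary HTTP text.
# This is roughly the request the TLS Termination Proxy sees after decrypting:
raw_request = (
    "GET /items/ HTTP/1.1\r\n"
    "Host: someapp.example.com\r\n"
    "Accept: application/json\r\n"
    "\r\n"  # blank line: end of the headers
)

# The first line carries the method, the path, and the protocol version.
method, path, version = raw_request.splitlines()[0].split()
print(method, path, version)  # → GET /items/ HTTP/1.1
```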
This is just an HTTP request through an encrypted TLS connection.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/880/0*wquTXVI_31a0KPid.png" /></figure><h3>Decrypt the Request</h3><p>The TLS Termination Proxy would use the encryption agreed to <strong>decrypt the request</strong>, and would transmit the <strong>plain (decrypted) HTTP request</strong> to the process running the application (for example a process with Uvicorn running the FastAPI application).</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/880/0*pvJlrvCCb-xEiuR8.png" /></figure><h3>HTTP Response</h3><p>The application would process the request and send a <strong>plain (unencrypted) HTTP response</strong> to the TLS Termination Proxy.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/880/0*ZshnpYSoWnceKRhh.png" /></figure><h3>HTTPS Response</h3><p>The TLS Termination Proxy would then <strong>encrypt the response</strong> using the cryptography agreed before (that started with the certificate for someapp.example.com), and send it back to the browser.</p><p>Next, the browser would verify that the response is valid and encrypted with the right cryptographic key, etc. 
It would then <strong>decrypt the response</strong> and process it.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/880/0*1Ss-P0_WgEKmFV1Z.png" /></figure><p>The client (browser) will know that the response comes from the correct server because it is using the cryptography they agreed using the <strong>HTTPS certificate</strong> before.</p><h3>Multiple Applications</h3><p>In the same server (or servers), there could be <strong>multiple applications</strong>, for example, other API programs or a database.</p><p>Only one process can be handling the specific IP and port (the TLS Termination Proxy in our example) but the other applications/processes can be running on the server(s) too, as long as they don’t try to use the same <strong>combination of public IP and port</strong>.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/880/0*9FwyZ4z_JXDW8-nR.png" /></figure><p>That way, the TLS Termination Proxy could handle HTTPS and certificates for <strong>multiple domains</strong>, for multiple applications, and then transmit the requests to the right application in each case.</p><h3>Certificate Renewal</h3><p>At some point in the future, each certificate would <strong>expire</strong> (about 3 months after acquiring it).</p><p>And then, there would be another program (in some cases it’s another program, in some cases it could be the same TLS Termination Proxy) that would talk to Let’s Encrypt, and renew the certificate(s).</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/880/0*9QpJitoQwT7XAKKV.png" /></figure><p>The <strong>TLS certificates</strong> are <strong>associated with a domain name</strong>, not with an IP address.</p><p>So, to renew the certificates, the renewal program needs to <strong>prove</strong> to the authority (Let’s Encrypt) that it indeed <strong>“owns” and controls that domain</strong>.</p><p>To do that, and to accommodate different application needs, there are several ways it can do it. 
Some popular ways are:</p><ul><li><strong>Modify some DNS records</strong>.</li><li>For this, the renewal program needs to support the APIs of the DNS provider, so, depending on the DNS provider you are using, this might or might not be an option.</li><li><strong>Run as a server</strong> (at least during the certificate acquisition process) on the public IP address associated with the domain.</li><li>As we said above, only one process can be listening on a specific IP and port.</li><li>This is one of the reasons why it’s very useful when the same TLS Termination Proxy also takes care of the certificate renewal process.</li><li>Otherwise, you might have to stop the TLS Termination Proxy momentarily, start the renewal program to acquire the certificates, then configure them with the TLS Termination Proxy, and then restart the TLS Termination Proxy. This is not ideal, as your app(s) will not be available during the time that the TLS Termination Proxy is off.</li></ul><p>All this renewal process, while still serving the app, is one of the main reasons why you would want to have a <strong>separate system to handle HTTPS</strong> with a TLS Termination Proxy instead of just using the TLS certificates with the application server directly (e.g. Uvicorn).</p><h3>Recap</h3><p>Having <strong>HTTPS</strong> is very important, and quite <strong>critical</strong> in most cases. 
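To put numbers on the renewal timing discussed above: Let's Encrypt certificates last about 90 days, and renewal is normally automated well before expiry. A sketch that parses a certificate's notAfter timestamp with the standard library's ssl helper and computes the remaining days (the expiry date here is made up):

```python
import ssl
from datetime import datetime, timezone

# Certificates expose their expiry as a "notAfter" timestamp. The ssl module's
# cert_time_to_seconds parses the textual format used in its certificate dicts.
not_after = "Jun 19 14:28:51 2026 GMT"  # made-up expiry date
expires_at = datetime.fromtimestamp(
    ssl.cert_time_to_seconds(not_after), tz=timezone.utc
)

def days_until_expiry(now: datetime) -> int:
    return (expires_at - now).days

# E.g. checked 80 days before expiry; renewal tools like certbot typically
# renew once fewer than ~30 days remain.
print(days_until_expiry(datetime(2026, 3, 31, 14, 28, 51, tzinfo=timezone.utc)))  # → 80
```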
Most of the effort you as a developer have to put around HTTPS is just about <strong>understanding these concepts</strong> and how they work.</p><p>But once you know the basic information of <strong>HTTPS for developers</strong> you can easily combine and configure different tools to help you manage everything in a simple way.</p><h3>Learn More</h3><p>This article is extracted from the <a href="https://fastapi.tiangolo.com/deployment/https/">FastAPI documentation about deployments and HTTPS</a>.</p><p>If you want to learn concrete examples of some tools you can use and how to configure them to deploy a FastAPI application, check out the next chapters in the <a href="https://fastapi.tiangolo.com/deployment/">FastAPI documentation</a>.</p><h3>About me</h3><p>Hey! 👋 I’m Sebastián Ramírez (<a href="https://tiangolo.com/">tiangolo</a>).</p><p>You can follow me, contact me, see what I do, or use my open source code:</p><ul><li><a href="https://github.com/tiangolo">GitHub: tiangolo</a></li><li><a href="https://twitter.com/tiangolo">Twitter: tiangolo</a></li><li><a href="https://www.linkedin.com/in/tiangolo/">LinkedIn: tiangolo</a></li><li><a href="https://dev.to/tiangolo">Dev: tiangolo.to</a></li><li><a href="https://tiangolo.medium.com/">Medium: tiangolo</a></li><li><a href="https://tiangolo.com/">Web: tiangolo.com</a></li></ul><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=5e42dcf7d4db" width="1" height="1" alt="">]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[The Future of FastAPI and Pydantic is Bright]]></title>
            <link>https://tiangolo.medium.com/the-future-of-fastapi-and-pydantic-is-bright-2d1785a603a9?source=rss-963974981597------2</link>
            <guid isPermaLink="false">https://medium.com/p/2d1785a603a9</guid>
            <category><![CDATA[fastapi]]></category>
            <category><![CDATA[python]]></category>
            <category><![CDATA[pydantic]]></category>
            <dc:creator><![CDATA[Sebastián Ramírez]]></dc:creator>
            <pubDate>Sat, 19 Jun 2021 14:28:51 GMT</pubDate>
            <atom:updated>2021-06-19T14:30:25.678Z</atom:updated>
            <content:encoded><![CDATA[<p>This article lives in:</p><ul><li><a href="https://dev.to/tiangolo/the-future-of-fastapi-and-pydantic-is-bright-3pbm">Dev.to</a></li><li><a href="https://tiangolo.medium.com/the-future-of-fastapi-and-pydantic-is-bright-2d1785a603a9">Medium</a></li><li><a href="https://github.com/tiangolo/blog-posts/blob/master/the-future-of-fastapi-and-pydantic-is-bright/README.md">GitHub</a></li></ul><h3>In very short</h3><p>The future of <a href="https://fastapi.tiangolo.com/">FastAPI</a> and <a href="https://pydantic-docs.helpmanual.io/">Pydantic</a> is bright. ✨</p><p>This is because we all, as the Python community, define their future. To help us and to help others. From the Core Developers making Python itself to the new developers who started learning Python this month.</p><p>And as long as these tools are helping us all solve problems, help ourselves, help others, and be more efficient and productive, we all will keep them working and improving.</p><p>And that’s what we are all doing. 🤓🚀</p><h3>Intro</h3><p>You might have heard not long ago about PEP 563, PEP 649, and some changes that could affect Pydantic and FastAPI in the future.</p><p>If you read about it, I wouldn’t expect you to understand what all that meant. I didn’t fully understand it until I spent hours reading all the related content and doing multiple experiments.</p><p>It might have worried you and maybe confused you a bit.</p><p>Now there’s nothing to be worried about. But still, here I want to help clarify all that and give you a bit more context.</p><p>Brace yourself, you are about to learn a bit more about how Python works, how FastAPI and Pydantic work, how type annotations work, and more. 👇</p><h3>Details</h3><h3>Start with a basic FastAPI app</h3><p>FastAPI is based on Pydantic. 
Let’s see a simple example using them both.</p><p>Imagine that we have a file ./main.py with the following code:</p><pre>from typing import Optional</pre><pre>import uvicorn<br>from fastapi import FastAPI<br>from pydantic import BaseModel<br></pre><pre>class Item(BaseModel):<br>    name: str<br>    description: Optional[str] = None<br>    price: float<br></pre><pre>app = FastAPI()</pre><pre>@app.post(&quot;/items/&quot;)<br>def create_item(item: Item):<br>    return item<br></pre><pre>if __name__ == &quot;__main__&quot;:<br>    uvicorn.run(app)</pre><p>You could run this example and start the API application with:</p><pre>$ python ./main.py</pre><pre>INFO:     Started server process [4418]<br>INFO:     Waiting for application startup.<br>INFO:     Application startup complete.<br>INFO:     Uvicorn running on http://127.0.0.1:8000 (Press CTRL+C to quit)</pre><p>Then you could open your browser and interact with the API docs at http://127.0.0.1:8000/docs, etc.</p><p>But here we want to focus on what happens behind the scenes.</p><p><strong>Note</strong>: Instead of using the last two lines, you could have used the uvicorn command, and that&#39;s what you would normally do. But for this example, it will be useful to see everything from the point of view of the python command.</p><h3>How Python works</h3><p>By running that command above, you are asking your system to start the program called python. And to give it the file main.py as a parameter.</p><p><strong>Note</strong>: In Windows, the program might be called python.exe instead of just python.</p><p>That program called python (or python.exe) is written in another programming language called &quot;C&quot;. 
Maybe you knew that.</p><p>And what that program python does is read the file main.py, interpret the code that we wrote in it using the <strong>Python</strong> Programming Language, and execute it step by step.</p><p>So, we have two things with more or less the same name “python” that represent something slightly different:</p><ul><li>python: the program that runs our code (which is actually written in the <strong>C</strong> programming language)</li><li>“Python”: the name of the programming language we use to write our code</li></ul><p>So, you could say that python (the program) can read <strong>Python</strong> (the programming language).</p><h3>What is Runtime</h3><p>Now, when that program python is executing our code written in the <strong>Python</strong> programming language, we call that &quot;runtime&quot;.</p><p>It’s just the period of time when it is executing our code.</p><p>When our code is not being executed, for example, when we are editing the file ./main.py, it is not <em>running</em>, so we are not at <strong>runtime</strong>.</p><p>The way that program works is that, at runtime (when our code is being executed), Pydantic and FastAPI read those <strong>type annotations</strong> (or <strong>type hints</strong>) to extract their data and do things with it.</p><p>So, for example, in the Item class above, we have:</p><pre>class Item(BaseModel):<br>    name: str<br>    description: Optional[str] = None<br>    price: float</pre><p>At <strong>runtime</strong>, Pydantic and FastAPI will see that name is a str and price is a float. And if we send a JSON request with a price that is not a float, they will be able to validate the data for us.</p><p>FastAPI and Pydantic are written in pure <strong>Python</strong>. How can these tools do that? <strong>Python</strong> is so powerful that it has features to allow exactly that, to read type annotations at runtime from the same <strong>Python</strong> code. 
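As a tiny illustration of that runtime capability, here is how any Python code (not just Pydantic) can read the type annotations of a class like the Item model at runtime, using only the standard library:

```python
from typing import Optional, get_type_hints

# Same shape as the Item model from the example above, as a plain class.
class Item:
    name: str
    description: Optional[str] = None
    price: float

# At runtime, the annotations are just data attached to the class,
# and the standard library can resolve them to the actual types.
hints = get_type_hints(Item)
print(hints["name"], hints["price"])  # → <class 'str'> <class 'float'>
```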
And Pydantic and FastAPI take advantage of those features.</p><p>Another term commonly used to refer to doing things at <strong>runtime</strong> is to do things <strong>dynamically</strong>.</p><h3>What is Static Analysis</h3><p>The counterpart of <strong>runtime</strong> would be <strong>static</strong>. It just means that the code is not being executed. It’s treated just as a text file containing code.</p><p>In many cases, “static” is used when saying <strong>Static Analysis</strong>, <strong>Static Checking</strong>, <strong>Static Type Checking</strong>, etc. It refers to tools that understand the rules of the <strong>Python</strong> Programming Language and that can analyze the code, but that don’t execute the code itself.</p><p>These tools for <strong>static analysis</strong> can check that the code follows the rules correctly and is valid, provide autocompletion, and offer other features. When you are editing code and your editor shows a squiggly red line with an error somewhere, that is <strong>static analysis</strong>.</p><p>In some cases, the code could be valid, but it would still be incorrect. For example, if you try to add a str and a float together:</p><pre>name = &quot;Rick&quot;<br>price = 1.99</pre><pre>total = name + price</pre><p>In terms of the rules of the language itself, the code is valid, all the quotes are where they should be, the equal signs are correctly placed, etc. But this code is still incorrect and will not work because you can’t add a str with a float.</p><p>Many editors will be able to show you a very valuable squiggly red line with the error message under name + price that might save you hours debugging. 
That is also <strong>static analysis</strong>.</p><p>Some tools that do <strong>static analysis</strong> and that you might have heard of are:</p><ul><li><strong>mypy</strong>, the official and main <strong>Static Type Checker</strong></li><li><strong>flake8</strong>, checks for style and correctness</li><li><strong>black</strong>, autoformats the code in a consistent way that improves efficiency</li><li><strong>PyCharm</strong>, one of the most popular Python editors, has internal components that do <strong>static analysis</strong> to check for errors, provide autocompletion, etc.</li><li><strong>VS Code</strong>, the other of the most popular Python editors, using Pylance, also has internal tools to do <strong>static analysis</strong> to check for errors, provide autocompletion, etc.</li></ul><p>These tools have saved tons of development hours by detecting many bugs earlier in the development process and in the exact place where those errors happened. I bet that in many cases you might have seen the red line, realize what the error is, think “ah, yeah, right”, fix it, and not even consider that <em>there was a bug in your code</em>, even for some seconds. If I counted all the times these tools have saved me from these bugs, I would get overwhelmed quickly. 😅</p><p>And if you have ever added type annotations to a code base that didn’t have them before, you probably would have seen lots of broken sections in the code base and broken corner cases, that were suddenly obvious and you could then fix them. I surely have.</p><h3>Type Annotations in Python</h3><p>The <strong>Type Annotations</strong> (also called <strong>Type Hints</strong>) that we have available in all the supported modern Python versions (Python 3.6 and above) were designed to improve all that <strong>static analysis</strong>.</p><p>The original intention was to allow mypy and others to help developers while <em>writing</em> the code. 
And that was the main focus for a while.</p><p>But then tools like dataclasses (from the standard library) and <a href="https://twitter.com/samuel_colvin">Samuel Colvin</a>&#39;s Pydantic started using these type annotations to do more than only <strong>static analysis</strong>, and to use these same <strong>type annotations</strong> at <strong>runtime</strong>. In the case of Pydantic, to extract that information to do data conversion, validation, and documentation.</p><h3>Type Annotations with Forward References</h3><p>Now, imagine we have a class (it could be a Pydantic model) like this:</p><pre>from typing import Optional</pre><pre>from pydantic import BaseModel<br></pre><pre>class Person(BaseModel):<br>    name: str<br>    child: Optional[Person] = None</pre><p>Here we have a Person that could have a child, that would also be a Person. It all looks fine, right?</p><p>But now when we run the code (or with the help of some static analysis in editors) we will see that we declared child: Optional[Person] inside the body of the class Person. So, when that part of the code is run by python, the Person inside of child: Optional[Person] doesn&#39;t exist yet (that class is still being created).</p><p>This is called a <strong>Forward Reference</strong>. And it would make the code break.</p><p>And again, the main purpose of these type annotations was to help with <strong>static analysis</strong>. Using them at runtime was not yet an important use case.</p><p>And having the code break just because we are trying to improve static analysis would be very annoying.</p><p>To overcome that problem, it’s also valid to declare that internal Person as a literal string, like this:</p><pre>from typing import Optional</pre><pre>from pydantic import BaseModel</pre><pre>class Person(BaseModel):<br>    name: str<br>    child: Optional[&quot;Person&quot;] = None</pre><p>That looked weird to me when I discovered it. It’s the name of a class just put there inside a string. 
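The string is resolved later, once the class actually exists. A small standard-library sketch (a plain class instead of a Pydantic model) showing that typing.get_type_hints turns the literal string back into the real Person class:

```python
from typing import Optional, get_type_hints

class Person:
    name: str
    # Forward reference: Person doesn't exist yet while this class body runs,
    # so the annotation is written as a literal string.
    child: Optional["Person"] = None

# Once the class exists, the string can be resolved to the real class.
hints = get_type_hints(Person)
print(hints["child"] == Optional[Person])  # → True
```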
But it’s valid.</p><p>When python is running, it will see that as a literal string, so it will not break.</p><p>And most static analysis tools know this is valid and will read the literal string and understand that it actually refers to the Person class.</p><p>By knowing that the Optional[&quot;Person&quot;] actually refers to the Person class, static analysis tools can, for example, detect that this would be an error:</p><pre>parent = Person(name=&quot;Beth&quot;)</pre><pre>parent.child = 3</pre><p>A smart editor will use its static analysis tools to detect that parent.child = 3 is an error because it expects a Person.</p><p>This solves the problem of the forward reference in the code and allows us to still use static analysis tools.</p><p>…we are not talking about using these type annotations at <strong>runtime</strong> yet, but we’ll get there later.</p><h3>PEPs in Python</h3><p>PEP stands for <strong>Python Enhancement Proposal</strong>. A PEP is a technical document describing changes to Python, additions to the standard library (for example, adding dataclasses), and other types of changes. Or in some cases, they just provide information and establish conventions.</p><p>The name says <strong>Proposal</strong>, but when they are finally accepted they become a standard.</p><h3>PEP 563 — Postponed Evaluation of Annotations</h3><p>Knowing what’s a PEP, let’s go back to the code example above.</p><p>If you hadn’t seen something like the Optional[&quot;Person&quot;] part before, you might have cringed a bit. 
I did the first time I discovered that was valid, but it was understandable as it would solve the problem.</p><p>Then <a href="https://twitter.com/llanga">Łukasz Langa</a> had a smart idea and wrote <a href="https://www.python.org/dev/peps/pep-0563/">PEP 563</a>.</p><p>If the way type annotations were interpreted changed, and if they were <em>implicitly</em> understood by Python as if they were all just strings, then we would not have to put all those classes inside strings in strange places in our code.</p><p>So, we would write our code like:</p><pre>from typing import Optional</pre><pre>from pydantic import BaseModel</pre><pre>class Person(BaseModel):<br>    name: str<br>    child: Optional[Person] = None</pre><p>And then whenever python read our file ./main.py it would see it as if it was written like this:</p><pre>from typing import Optional</pre><pre>from pydantic import BaseModel</pre><pre>class Person(BaseModel):<br>    name: &quot;str&quot;<br>    child: &quot;Optional[Person]&quot; = None</pre><p>So, python would run our code happily and without breaking.</p><p>And we, the developers would be much happier not having to remember where to put things inside strings and where not.</p><p>And we would be able to keep using autocompletion and type checks even in these type annotations with forward references. 
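With the future import, this "annotations as strings" behavior is directly observable (again with a plain class instead of a Pydantic model): the stored annotation is the literal source text, and it is only evaluated when something asks for the types at runtime:

```python
from __future__ import annotations

from typing import Optional, get_type_hints

class Person:
    name: str
    child: Optional[Person] = None  # no quotes needed: implicitly a string

# PEP 563: the annotation is stored as its literal source text...
print(repr(Person.__annotations__["child"]))  # → 'Optional[Person]'

# ...and only evaluated on demand, when the class already exists.
print(get_type_hints(Person)["child"] == Optional[Person])  # → True
```

get_type_hints performs the deferred evaluation for the tools that do need the real types at runtime.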
For example, triggering autocompletion inside a string, with the previous technique, might not always work, but with this change that wouldn’t be a problem anymore.</p><p>And in the case that some tool ended up using these type annotations at runtime for other reasons, there were still ways to get the information at runtime, with some <em>small caveats</em>, but it was still possible.</p><p><strong>Spoiler Alert</strong>: These <em>small caveats</em> are what later would become a cumbersome problem for Pydantic, but we’ll get there.</p><p><strong>Note</strong>: Have in mind that this was done several years ago, in fact, the same year Pydantic was released for the first time. Using type annotations at runtime for other purposes than static analysis was not a common use case if at all. It’s remarkable that it was even accounted for.</p><p>Now, as this would change the behavior of Python internally in a more or less drastic way, it would not be enforced by default yet. Instead, it was made available using a special import, from __future__ import annotations:</p><pre>from __future__ import annotations<br>from typing import Optional</pre><pre>from pydantic import BaseModel</pre><pre>class Person(BaseModel):<br>    name: str<br>    child: Optional[Person] = None</pre><p>And as now these type annotations were treated as just strings, it allowed some interesting tricks when using them only for static analysis, like using typing features from future versions of Python in previous versions.</p><p>For example, declaring Person | None instead of Optional[Person], avoiding the extra Optional and the extra import, even in Python 3.7 (that feature is available in Python 3.10 but not in Python 3.7):</p><pre>from __future__ import annotations</pre><pre>class Person:<br>    name: str<br>    child: Person | None = None</pre><p><strong>Note</strong>: Have in mind that this would only work for static analysis tools, your editor could understand that even in Python 3.7, but Pydantic 
wouldn’t be able to use it and wouldn’t work correctly.</p><p>This has been there, available since Python 3.7. And that behavior was planned to be the default for Python 3.10 onwards (not now, but keep reading).</p><h3>Pydantic and PEP 563</h3><p>Now, forward to the present, a couple of months ago.</p><p><a href="https://pydantic-docs.helpmanual.io/usage/postponed_annotations/">Pydantic already has <em>some</em> support</a> for using from __future__ import annotations in the code as made possible by PEP 563. And in many cases, it works fine. For example, this works:</p><pre>from __future__ import annotations<br>from typing import Optional</pre><pre>from fastapi import FastAPI<br>from pydantic import BaseModel<br></pre><pre># ✅ Pydantic models outside of functions will always work<br>class Item(BaseModel):<br>    name: str<br>    description: Optional[str] = None<br>    price: float<br></pre><pre>app = FastAPI()<br></pre><pre>@app.post(&quot;/items/&quot;)<br>def create_item(item: Item):<br>    return item</pre><p>But there are <em>some caveats</em> that wouldn’t work. For example, this doesn’t work:</p><pre>from __future__ import annotations<br>from typing import Optional</pre><pre>from fastapi import FastAPI<br>from pydantic import BaseModel<br></pre><pre>def create_app():<br>    # 🚨 Pydantic models INSIDE of functions would not work<br>    class Item(BaseModel):<br>        name: str<br>        description: Optional[str] = None<br>        price: float<br></pre><pre>    app = FastAPI()<br></pre><pre>    @app.post(&quot;/items/&quot;)<br>    def create_item(item: Item):<br>        return item<br>    return app<br></pre><pre>app = create_app()</pre><p>If you run that code, you would get a disconcerting error:</p><pre>NameError: name &#39;Item&#39; is not defined</pre><p>To solve it in this case, you could move the Item class outside of the function. 
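The same caveat can be reproduced without Pydantic, using only get_type_hints: with PEP 563 semantics the string annotations are evaluated later against the module's namespace, where a function-local class is no longer reachable (the class names here are made up):

```python
from __future__ import annotations

from typing import get_type_hints

def create_models():
    # 🚨 Local class: its name exists only inside this function.
    class Inner:
        name: str

    class Outer:
        item: Inner  # stored as the string "Inner" under PEP 563

    return Outer

Outer = create_models()

# By the time the string annotation is evaluated, the function's local
# namespace is gone, so the name cannot be resolved anymore.
try:
    get_type_hints(Outer)
    resolved = True
except NameError as exc:
    resolved = False
    print(exc)  # → name 'Inner' is not defined

print(resolved)  # → False
```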
And there are some other similar corner cases.</p><p>These types of disconcerting problems would be especially inconvenient for newcomers to Python (and probably to many experienced Python developers as well), as the problem is not obvious at all for someone who doesn’t know the internals (it wasn’t obvious to me, and I built FastAPI and Typer 😅).</p><p>Python is an example of a very inclusive global tech community, welcoming newcomers from all around the world, from many disciplines. It is being used to solve the most complex problems, including taking pictures of black holes, running drones on Mars, and building the most sophisticated artificial intelligence systems. But at the same time, it’s many people’s first programming language for its ease of use and its simplicity. And many Python developers don’t even consider themselves “developers”, even while they use it to solve problems.</p><p>So, having an inconvenience like this by default would not be ideal. There are other caveats but I don’t want to go deeper into the technical details than I already have. 
You can read more about them on the <a href="https://github.com/samuelcolvin/pydantic/issues/2678">Pydantic issue</a>, the <a href="https://mail.python.org/archives/list/python-dev@python.org/thread/QSASX6PZ3LIIFIANHQQFS752BJYFUFPY/#UITB2A657TAINAGWGRD6GCKWFC5PEBIZ">mailing list thread</a>, and <a href="https://mail.python.org/archives/list/python-dev@python.org/thread/ZBJ7MD6CSGM6LZAOTET7GXAVBZB7O77O/">Łukasz’s detailed explanation</a>.</p><h3>PEP 649 — Deferred Evaluation Of Annotations Using Descriptors</h3><p>Recently, Larry Hastings, who had been working on an alternative to PEP 563, <a href="https://www.python.org/dev/peps/pep-0649/">PEP 649</a>, contacted Samuel Colvin (Pydantic’s author) and me (author of FastAPI and Typer), as suggested by <a href="https://twitter.com/brettsky">Brett Cannon</a> (from the Python Steering Council), to see if and how those changes would affect us.</p><p>We realized that the changes from PEP 563 (the other one) would be permanently added to Python 3.10 (not requiring the from __future__ import annotations), and the caveats and problems still didn&#39;t have a solution.</p><p>Suddenly it was also clear that using type annotations at runtime, instead of only for static analysis, was not an obvious use case for everyone involved, including Larry Hastings himself, who was working on what would be a potential solution for these use cases.</p><h3>Asking for Reconsideration</h3><p>Sadly, we realized all this very late, only weeks before these changes would be set in stone in Python 3.10 (in the end they weren’t). Nevertheless, we showed our concerns.</p><p>If you read about all this before, that’s probably why. It was shared a lot, and it got a bit out of hand.</p><p>And sadly, there were some radical comments attacking several of the parties involved (the Python Steering Council, us, etc), as if it was a fight between different groups. 
😕</p><p>In reality, we are just one big group, the Python Community, and we are all trying to do the best for all of us.</p><p>Sadly, all this sudden friction brought a lot of increased stress to all the parties involved. To the Python Steering Council, Core Python Developers, and us, library authors.</p><p>Fortunately, everything came out well in the end.</p><p>Here’s a big shoutout to <a href="https://twitter.com/willingcarol">Carol Willing</a> who, despite the added stress for her and everyone else involved, helped a lot in reconciling different points of view, reducing the friction, and calming the whole situation down. That capacity to acknowledge and adopt others’ points of view is priceless. We need more Carol Willings in the world. 🤓</p><h3>Python Steering Council decision</h3><p>In case you didn’t know, the decision of what goes into Python and what doesn’t is made by the <strong>Python Steering Council</strong>.</p><p>It is currently formed by:</p><ul><li>Barry Warsaw</li><li>Brett Cannon</li><li>Carol Willing</li><li>Pablo Galindo Salgado</li><li>Thomas Wouters</li></ul><p>Now, back to the story, after a couple of days of that previous discussion, during the next Python Steering Council meeting, they <a href="https://mail.python.org/archives/list/python-dev@python.org/thread/CLVXXPQ2T2LQ5MP2Y53VVQFCXYWQJHKZ/">unanimously decided to roll back the decision</a> of making these string-based type annotations (as described in PEP 563) the default behavior.</p><p>Having those string type annotations by default in Python 3.10 had been decided some time ago, and rolling that change back only weeks before the “feature freeze” (the moment where no more changes are accepted into the next version) was a big decision, involving a lot of extra stress and effort.</p><p>Nevertheless, they made the decision in order to support the community of users of FastAPI, Pydantic, and other libraries using these features:</p><blockquote><em>We can’t risk breaking 
even a small subset of the FastAPI/pydantic users, not to mention other uses of evaluated type annotations that we’re not aware of yet.</em></blockquote><p>This, again, shows the strong commitment of the Python community, starting from the Steering Council, to be inclusive, and supportive of all users, with different use cases.</p><p>Here’s another big shoutout to <a href="https://twitter.com/pyblogsal">Pablo Galindo</a>, who <a href="https://github.com/samuelcolvin/pydantic/issues/2678#issuecomment-823569522">took on all the extra work</a> to perform all the last-minute changes, and even voted in favor of them.</p><h3>What’s Next</h3><p>The decision was to keep the current behavior, of allowing from __future__ import annotations in the code, as defined by PEP 563, but not as the default behavior.</p><p>This will provide enough time to find a solution or an alternative that works for all the use cases, including Pydantic, FastAPI, and also the use cases that are interested exclusively in static analysis.</p><p>This is the best possible outcome for everyone. 🎉</p><p>It gives enough time to find an alternative solution, and it avoids hurried, last-minute decisions that could have unknown negative effects.</p><h3>Who cares about FastAPI and Pydantic</h3><p>Now, in general, what does the future of FastAPI and Pydantic look like? Who cares about them?</p><p>FastAPI, using Pydantic, was included for the first time in the last Python Developer Survey, and despite being the first year in it, it was already <a href="https://www.jetbrains.com/lp/python-developers-survey-2020/#FrameworksLibraries">ranked as the third most popular web framework</a>, after Flask and Django. 
This shows that it’s being useful for many people.</p><p>It was also included in the latest <a href="https://www.thoughtworks.com/radar/languages-and-frameworks?blipid=202104087">ThoughtWorks Technology Radar</a>, as one of the technologies that enterprises should start trying out.</p><p>FastAPI and Pydantic are currently being used by many products and organizations, from the biggest ones you’ve heard of to the smallest teams, including solo developers.</p><p>Several popular and widely used cloud providers, SaaS tools, databases, etc. are adding documentation, tutorials, and even improving their offerings to better serve FastAPI users.</p><p>The most popular code editors for Python, <a href="https://www.jetbrains.com/pycharm/">PyCharm</a> and <a href="https://code.visualstudio.com/">Visual Studio Code</a>, have been working on improving their support for FastAPI and Pydantic. I have even talked to both teams directly. 🤓</p><p>This is particularly interesting because FastAPI was designed to have the best support from editors, to provide the best developer experience possible. FastAPI and Pydantic use almost exclusively standard features of the language. When editors improve their support (even more) for these tools, they are actually improving their support for the features of the language itself. And this benefits many other use cases apart from FastAPI and Pydantic.</p><h3>Conclusion</h3><p>Python has a great community.</p><p>We are all trying to make it better for all of us, from the Steering Council and Core Developers to library authors and even <a href="https://fastapi.tiangolo.com/fastapi-people/">those who help others</a> using these libraries.</p><p>FastAPI and Pydantic are part of this community that includes and supports everyone, with all their use cases.</p><p>And that’s the main reason why the future of FastAPI and Pydantic is so bright. Because the future of Python is bright. We all make this future. 
✨</p><h3>Thanks</h3><p>Thanks to everyone involved in finding a solution and improving the Python community. 🙇</p><p>And special thanks to:</p><ul><li><a href="https://twitter.com/llanga">Łukasz Langa</a></li><li><a href="https://twitter.com/willingcarol">Carol Willing</a></li><li><a href="https://twitter.com/samuel_colvin">Samuel Colvin</a></li></ul><p>for their review and feedback on this article before publishing.</p><h3>About me</h3><p>Hey! 👋 I’m Sebastián Ramírez (<a href="https://tiangolo.com/">tiangolo</a>).</p><p>You can follow me, contact me, see what I do, or use my open source code:</p><ul><li><a href="https://github.com/tiangolo">GitHub: tiangolo</a></li><li><a href="https://twitter.com/tiangolo">Twitter: tiangolo</a></li><li><a href="https://www.linkedin.com/in/tiangolo/">LinkedIn: tiangolo</a></li><li><a href="https://dev.to/tiangolo">Dev.to: tiangolo</a></li><li><a href="https://tiangolo.medium.com/">Medium: tiangolo</a></li><li><a href="https://tiangolo.com/">Web: tiangolo.com</a></li></ul><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=2d1785a603a9" width="1" height="1" alt="">]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[Deploying FastAPI (and other) apps with HTTPS powered by Traefik]]></title>
            <link>https://tiangolo.medium.com/deploying-fastapi-and-other-apps-with-https-powered-by-traefik-e30b5058b98d?source=rss-963974981597------2</link>
            <guid isPermaLink="false">https://medium.com/p/e30b5058b98d</guid>
            <category><![CDATA[fastapi]]></category>
            <category><![CDATA[https]]></category>
            <category><![CDATA[traefik]]></category>
            <category><![CDATA[cloud-computing]]></category>
            <dc:creator><![CDATA[Sebastián Ramírez]]></dc:creator>
            <pubDate>Sat, 06 Mar 2021 19:17:56 GMT</pubDate>
            <atom:updated>2021-03-06T19:20:05.402Z</atom:updated>
            <content:encoded><![CDATA[<figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*-s3HRAOLKthDiUG1oQrs5g.png" /></figure><p>This article lives in:</p><ul><li><a href="https://dev.to/tiangolo/deploying-fastapi-and-other-apps-with-https-powered-by-traefik-5dik">Dev.to</a></li><li><a href="https://tiangolo.medium.com/deploying-fastapi-and-other-apps-with-https-powered-by-traefik-e30b5058b98d">Medium</a></li><li><a href="https://github.com/tiangolo/blog-posts/blob/master/deploying-fastapi-apps-with-https-powered-by-traefik/README.md">GitHub</a></li></ul><h3>Intro</h3><p>Let’s say you have a <strong>FastAPI</strong> application… or actually, any other type of web application, including a <a href="https://panel.holoviz.org/">Panel</a> dashboard with Pandas DataFrames and Bokeh visualizations, or a <a href="https://www.streamlit.io/">Streamlit</a> application. These are, in the end, web applications. You could think of many other examples.</p><p>Now let’s say it all works well locally, on your machine.</p><p>But in most cases, the purpose of these web apps is to be available on the real web (not only on your machine), so that others can actually access them.</p><p>So you need to “deploy” them somewhere, on a remote server.</p><p>And then you would want to have secure communication between your app clients (web browsers, mobile apps, etc.) and your server web application.</p><p>So, you should have <strong>HTTPS</strong>. 🔒</p><p>But although it might sound like a simple “option” to enable, it’s quite a bit more complex than that… and <a href="https://doc.traefik.io/traefik/">Traefik</a> can help you a lot.</p><p>I have been a long-time fan of Traefik, even before creating FastAPI.</p><p>And recently I had the chance to do an event/webinar with them. 
🎉</p><p>You can watch the <a href="https://traefik.io/resources/traefik-fastapi-kuberrnetes-ai-ml/?utm_campaign=Influencer:%20Sebastian%20Ramirez,%20FastAPI%20&amp;utm_content=155438367&amp;utm_medium=social&amp;utm_source=twitter&amp;hss_channel=tw-4890312130">video recording here on the Traefik resources website</a>.</p><h3>About HTTPS</h3><p>HTTPS is quite a bit more complex than “enabling an option”.</p><p>The protocol your applications will need to “talk” is actually the same HTTP, so you don’t have to change anything in your web apps to go from HTTP to HTTPS.</p><p>But that HTTP communication has to go through a secure connection (TLS/SSL); that’s where the “S” in HTTPS comes from, “HTTP Secure”.</p><p>There’s a whole process required, including acquiring HTTPS (TLS/SSL) certificates. Fortunately, <a href="https://letsencrypt.org/">Let’s Encrypt</a> provides them for free… you just have to set everything up.</p><p>But “setting everything up”, including acquiring the certificates, installing them where appropriate, renewing them every three months, etc., is a relatively complex process. Traefik can do all that for you.</p><p>To quickly learn how HTTPS works from the consumer’s perspective, I highly encourage you to go and check <a href="https://howhttps.works/">HowHTTPS.works</a>.</p><p>Then you can go and read the short summary of what you need to know as a <em>developer</em> in the <a href="https://fastapi.tiangolo.com/deployment/https/">FastAPI docs: About HTTPS</a>.</p><h3>Domain name</h3><p>HTTPS is tied to a domain name because the TLS certificate is issued for that specific domain name.</p><p>So, you need to have one or buy one.</p><p>I buy my domains at <a href="https://www.name.com/referral/16bb2">Name.com</a>; it’s quite cheap and it has worked quite well for me.</p><h3>Remote server</h3><p>You will also need a “cloud” or remote server.</p><p>It’s frequently called a “VPS”, for “virtual private server”. 
It’s a “private server” because you get a full Linux system with full control of it (as opposed to “shared hosting”). And it’s “virtual” because what providers do is create a virtual machine and make it available to you, instead of installing a real physical server; that’s why they are affordable.</p><p>For simplicity, I would suggest these providers:</p><ul><li><a href="https://m.do.co/c/fc6c7539f7a9">DigitalOcean</a></li><li><a href="https://www.linode.com/?r=8ee6f60b2ddb258bba3fefe264771bca3660fb97">Linode</a></li><li><a href="https://www.vultr.com/?ref=7529603">Vultr</a></li></ul><p>I personally have things in each one of those. They all work great, they have a simple and nice user experience, and they are quite cheap.</p><p>Even $5 or $10 USD a month is enough to have one of the small servers up and running.</p><p>You can also go and use one of the giant cloud infrastructure providers if you want: learn all their terminology and components, set up all the accounts, permissions, etc., and then use them. But for this example, I would suggest one of the three above, as it will be a lot simpler.</p><h3>DNS records</h3><p>When you create a remote server, it will have a public IP.</p><p>But now you need to configure your domain to point to that IP, so that when your users go to your domain, they end up talking to your remote server at its IP.</p><p>There’s a set of “records” that do that; they are called “DNS records” (DNS for “Domain Name System”).</p><p>Those records are stored in “Name Servers”. 
All of the cloud providers above have free Name Servers, so you can use them to store that information about pointing domains to IPs.</p><p><strong>Tip</strong>: those same DNS records are also used for configuring email and other related small things.</p><p><strong>Note</strong>: all these Name Server and DNS changes are automatically copied and replicated through the web, so that everyone in the world knows where to access the information about your domain, and then, with that, they will know which IP to talk to when interacting with your domain. Because that replication takes some time, after you save some of these changes, they can take from minutes to hours to be ready.</p><h3>Name Servers</h3><p>The first step is in your “registrar” (the entity that sold you the domain, e.g. Name.com). There, you define the Name Servers for your domain.</p><p>You will probably first want to remove the default Name Servers. After buying a domain, the default Name Servers are normally the ones for the same registrar (e.g. Name.com), and normally all they do is have <strong>DNS records</strong> to point the domain to a placeholder page full of ads; they don’t allow you to create your own <strong>DNS records</strong> (like pointing the domain to an IP address).</p><p>So, you will probably want to remove those default <strong>Name Servers</strong> and add the ones for your VPS provider.</p><p>E.g. 
you could add the Name Servers for DigitalOcean:</p><pre>ns1.digitalocean.com<br>ns2.digitalocean.com<br>ns3.digitalocean.com</pre><h3>DNS Records</h3><p>After you configure the <strong>Name Servers</strong> for your domain to be the ones for your cloud provider, you can now go to that cloud provider and set up the <strong>DNS Records</strong>.</p><p>Depending on your cloud provider, they will have some section to configure “domains”, “domain zones”, or “networks”; in the end, they all refer to the same configuration of <strong>DNS records</strong> for a specific domain.</p><p>So, the next step is to create a configuration there for your specific domain (sometimes called a “domain zone”).</p><p>Then, inside of that domain configuration, you need to add a <strong>DNS record</strong> to point any web communication to your cloud server.</p><p>There are several types of DNS records. The one we need is an <strong>A record</strong>; when you are creating a DNS record, that is normally the default type, as it is the most important one.</p><p>An <strong>A record</strong> has an <strong>IP</strong> and a <strong>hostname</strong>.</p><p>The <strong>IP</strong> would be the one for your remote server. You might need to go to the section in the dashboard where your server is located to copy that IP.</p><p>The <strong>hostname</strong> would be your domain or any sub-domain. So, if you bought example.com, you can set the record to example.com, or to somesubdomain.example.com, or also a.long.sub.domain.example.com. In most cases, you can even use *.example.com, which will match any sub-domain and point it to the IP you specify.</p><p>You can create multiple <strong>A records</strong>, one for each domain or sub-domain. And each of them can point to different IPs. 
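Once a record has replicated, you can also check what a hostname resolves to straight from Python's standard library. A small sketch (localhost is used here only as a stand-in; you would pass your own domain):

```python
import socket


def resolve_a_record(hostname: str) -> str:
    """Return the IPv4 address the hostname currently resolves to."""
    return socket.gethostbyname(hostname)


# "localhost" is only a placeholder; you would pass your own domain,
# e.g. resolve_a_record("example.com"), and compare the result with
# the IP you configured in the A record.
print(resolve_a_record("localhost"))  # → 127.0.0.1
```

This complements the ping check shown in the next section: both query your system's resolver, so if they return the IP you configured, the record has reached you.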
That’s also why you see some applications that use several domains, like dashboard.example.com and api.example.com, to handle different parts of the same system in different servers.</p><p><strong>Note</strong>: depending on the provider, you might need to use the symbol @ in the hostname to mean &quot;the same domain I&#39;m configuring&quot;, so, for the domain configurations for example.com, creating an <strong>A record</strong> with some IP and the hostname @ would mean &quot;point the same domain example.com to that IP address&quot;.</p><h3>Wait</h3><p>You might have to wait some time for these DNS changes to replicate.</p><p>You can test if your computer already has access to the most recent version of your records with the tool ping from the command line. For example, checking for the domain tiangolo.com:</p><pre>$ ping tiangolo.com                               <br>PING tiangolo.com (104.198.14.52) 56(84) bytes of data.<br>64 bytes from 52.14.198.104.bc.googleusercontent.com (104.198.14.52): icmp_seq=1 ttl=103 time=204 ms<br>64 bytes from 52.14.198.104.bc.googleusercontent.com (104.198.14.52): icmp_seq=2 ttl=103 time=226 ms</pre><p>You can see that the IP address is 104.198.14.52. If that&#39;s what you just configured, congrats!</p><p>The DNS records are ready. 
🎉</p><h3>Check the video</h3><p>From this point, you should be able to <a href="https://traefik.io/resources/traefik-fastapi-kuberrnetes-ai-ml/?utm_campaign=Influencer:%20Sebastian%20Ramirez,%20FastAPI%20&amp;utm_content=155438367&amp;utm_medium=social&amp;utm_source=twitter&amp;hss_channel=tw-4890312130">follow the video recording</a> with all the explanations.</p><p>So I’ll keep the rest of this post as simple as possible, mainly showing you the config files so you can copy all the examples.</p><h3>Simple FastAPI app</h3><p>Let’s start with a basic <a href="https://fastapi.tiangolo.com/"><strong>FastAPI</strong></a> app.</p><p>I’m assuming that you know a bit about FastAPI. If you don’t, feel free to check the documentation; it is written as a tutorial.</p><p>If you want to see the explanation step by step, feel free to check the video.</p><p>The basic app we will use is in a file at ./app/main.py, with:</p><pre>from fastapi import FastAPI</pre><pre>app = FastAPI()<br></pre><pre>@app.get(&quot;/&quot;)<br>def read_main():<br>    return {&quot;message&quot;: &quot;Hello World of FastAPI with Traefik&quot;}</pre><h3>Dockerfile</h3><p>We will use <a href="https://www.docker.com/">Docker</a> to deploy everything.</p><p>So, make sure you install it.</p><p>Then we need a file at ./app/Dockerfile with:</p><pre>FROM tiangolo/uvicorn-gunicorn-fastapi:python3.8</pre><pre>COPY ./app /app/</pre><p>Notice that we are using the official FastAPI Docker image: tiangolo/uvicorn-gunicorn-fastapi:python3.8.</p><p>The official base Docker image does most of the work for us, so we just have to copy the code inside.</p><p>Make sure you have <a href="https://docs.docker.com/get-docker/">Docker installed</a> on your local computer and on the remote server.</p><h3>Prepare your cloud server</h3><ul><li>Connect to your remote server from your terminal with SSH; it could be something like:</li></ul><pre>ssh root@fastapi-with-traefik.example.com</pre><ul><li>Update the list of package 
versions available:</li></ul><pre>apt update</pre><ul><li>Upgrade the packages to the latest version:</li></ul><pre>apt upgrade</pre><h3>Docker Compose</h3><p>We are using Docker Compose to manage all the configurations. So make sure you <a href="https://docs.docker.com/compose/install/">install Docker Compose</a> locally and on the remote server.</p><p>To prevent Docker Compose from hanging, install haveged:</p><pre>apt install haveged</pre><p><strong>Technical Details</strong>: Docker Compose uses the internal pseudo-random number generators of the machine. But a freshly installed/created cloud server might not have enough of that “randomness”. And that could make the Docker Compose commands hang forever waiting for enough “randomness” to use. haveged prevents/fixes that issue.</p><p>After that, you can check that Docker Compose works correctly.</p><h3>Docker Compose files</h3><p>For all the detailed explanations of the Docker Compose files, <a href="https://traefik.io/resources/traefik-fastapi-kuberrnetes-ai-ml/?utm_campaign=Influencer:%20Sebastian%20Ramirez,%20FastAPI%20&amp;utm_content=155438367&amp;utm_medium=social&amp;utm_source=twitter&amp;hss_channel=tw-4890312130">check the video recording</a>.</p><p>Make sure you update the domains from example.com to use yours, and the email to register with Let&#39;s Encrypt; you will receive notifications about your expiring certificates at that email address.</p><p>Also, make sure you add the right <strong>DNS records</strong> for your main application and for the Traefik dashboard, and update them in the Docker Compose files accordingly.</p><p>Here are the Docker Compose files if you want to easily copy them.</p><ul><li>docker-compose.traefik.yml:</li></ul><pre>services:</pre><pre>  traefik:<br>    # Use the latest v2.3.x Traefik image available<br>    image: traefik:v2.3<br>    ports:<br>      # Listen on port 80, default for HTTP, necessary to redirect to HTTPS<br>      - 80:80<br>      # Listen on port 443, default 
for HTTPS<br>      - 443:443<br>    restart: always<br>    labels:<br>      # Enable Traefik for this service, to make it available in the public network<br>      - traefik.enable=true<br>      # Define the port inside of the Docker service to use<br>      - traefik.http.services.traefik-dashboard.loadbalancer.server.port=8080<br>      # Make Traefik use this domain in HTTP<br>      - traefik.http.routers.traefik-dashboard-http.entrypoints=http<br>      - traefik.http.routers.traefik-dashboard-http.rule=Host(`traefik.fastapi-with-traefik.example.com`)<br>      # Use the traefik-public network (declared below)<br>      - traefik.docker.network=traefik-public<br>      # traefik-https the actual router using HTTPS<br>      - traefik.http.routers.traefik-dashboard-https.entrypoints=https<br>      - traefik.http.routers.traefik-dashboard-https.rule=Host(`traefik.fastapi-with-traefik.example.com`)<br>      - traefik.http.routers.traefik-dashboard-https.tls=true<br>      # Use the &quot;le&quot; (Let&#39;s Encrypt) resolver created below<br>      - traefik.http.routers.traefik-dashboard-https.tls.certresolver=le<br>      # Use the special Traefik service api@internal with the web UI/Dashboard<br>      - traefik.http.routers.traefik-dashboard-https.service=api@internal<br>      # https-redirect middleware to redirect HTTP to HTTPS<br>      - traefik.http.middlewares.https-redirect.redirectscheme.scheme=https<br>      - traefik.http.middlewares.https-redirect.redirectscheme.permanent=true<br>      # traefik-http set up only to use the middleware to redirect to https<br>      - traefik.http.routers.traefik-dashboard-http.middlewares=https-redirect<br>      # admin-auth middleware with HTTP Basic auth<br>      # Using the environment variables USERNAME and HASHED_PASSWORD<br>      - traefik.http.middlewares.admin-auth.basicauth.users=${USERNAME?Variable not set}:${HASHED_PASSWORD?Variable not set}<br>      # Enable HTTP Basic auth, using the middleware created above<br>      
- traefik.http.routers.traefik-dashboard-https.middlewares=admin-auth<br>    volumes:<br>      # Add Docker as a mounted volume, so that Traefik can read the labels of other services<br>      - /var/run/docker.sock:/var/run/docker.sock:ro<br>      # Mount the volume to store the certificates<br>      - traefik-public-certificates:/certificates<br>    command:<br>      # Enable Docker in Traefik, so that it reads labels from Docker services<br>      - --providers.docker<br>      # Do not expose all Docker services, only the ones explicitly exposed<br>      - --providers.docker.exposedbydefault=false<br>      # Create an entrypoint &quot;http&quot; listening on port 80<br>      - --entrypoints.http.address=:80<br>      # Create an entrypoint &quot;https&quot; listening on port 443<br>      - --entrypoints.https.address=:443<br>      # Create the certificate resolver &quot;le&quot; for Let&#39;s Encrypt, uses the environment variable EMAIL<br>      - --certificatesresolvers.le.acme.email=admin@example.com<br>      # Store the Let&#39;s Encrypt certificates in the mounted volume<br>      - --certificatesresolvers.le.acme.storage=/certificates/acme.json<br>      # Use the TLS Challenge for Let&#39;s Encrypt<br>      - --certificatesresolvers.le.acme.tlschallenge=true<br>      # Enable the access log, with HTTP requests<br>      - --accesslog<br>      # Enable the Traefik log, for configurations and errors<br>      - --log<br>      # Enable the Dashboard and API<br>      - --api<br>    networks:<br>      # Use the public network created to be shared between Traefik and<br>      # any other service that needs to be publicly available with HTTPS<br>      - traefik-public</pre><pre>volumes:<br>  # Create a volume to store the certificates, there is a constraint to make sure<br>  # Traefik is always deployed to the same Docker node with the same volume containing<br>  # the HTTPS certificates<br>  traefik-public-certificates:</pre><pre>networks:<br>  # Use the previously 
created public network &quot;traefik-public&quot;, shared with other<br>  # services that need to be publicly available via this Traefik<br>  traefik-public:<br>    external: true</pre><ul><li>docker-compose.yml:</li></ul><pre>services:</pre><pre>  backend:<br>    build: ./<br>    restart: always<br>    labels:<br>      # Enable Traefik for this specific &quot;backend&quot; service<br>      - traefik.enable=true<br>      # Define the port inside of the Docker service to use<br>      - traefik.http.services.app.loadbalancer.server.port=80<br>      # Make Traefik use this domain in HTTP<br>      - traefik.http.routers.app-http.entrypoints=http<br>      - traefik.http.routers.app-http.rule=Host(`fastapi-with-traefik.example.com`)<br>      # Use the traefik-public network (declared below)<br>      - traefik.docker.network=traefik-public<br>      # Make Traefik use this domain in HTTPS<br>      - traefik.http.routers.app-https.entrypoints=https<br>      - traefik.http.routers.app-https.rule=Host(`fastapi-with-traefik.example.com`)<br>      - traefik.http.routers.app-https.tls=true<br>      # Use the &quot;le&quot; (Let&#39;s Encrypt) resolver<br>      - traefik.http.routers.app-https.tls.certresolver=le<br>      # https-redirect middleware to redirect HTTP to HTTPS<br>      - traefik.http.middlewares.https-redirect.redirectscheme.scheme=https<br>      - traefik.http.middlewares.https-redirect.redirectscheme.permanent=true<br>      # Middleware to redirect HTTP to HTTPS<br>      - traefik.http.routers.app-http.middlewares=https-redirect<br>      - traefik.http.routers.app-https.middlewares=admin-auth<br>    networks:<br>      # Use the public network created to be shared between Traefik and<br>      # any other service that needs to be publicly available with HTTPS<br>      - traefik-public</pre><pre>networks:<br>  traefik-public:<br>    external: true</pre><ul><li>docker-compose.override.yml:</li></ul><pre>services:</pre><pre>  backend:<br>    ports:<br>      - 
80:80</pre><pre>networks:<br>  traefik-public:<br>    external: false</pre><h3>Start the stacks</h3><p>There are many approaches for putting your code and Docker images on your server.</p><p>You could have a very sophisticated Continuous Integration system. But for this example using a simple rsync would be enough.</p><p>For example:</p><pre>rsync -a ./* root@fastapi-with-traefik.example.com:/root/code/fastapi-with-traefik/</pre><p>Then, inside of your server, make sure you create the Docker network:</p><pre>docker network create traefik-public</pre><p>Next, create the environment variables for HTTP Basic Auth.</p><ul><li>Create the username, e.g.:</li></ul><pre>export USERNAME=admin</pre><ul><li>Create an environment variable with the password, e.g.:</li></ul><pre>export PASSWORD=changethis</pre><ul><li>Use openssl to generate the &quot;hashed&quot; version of the password and store it in an environment variable:</li></ul><pre>export HASHED_PASSWORD=$(openssl passwd -apr1 $PASSWORD)</pre><p>And now you can start the <strong>Traefik</strong> Docker Compose stack:</p><pre>docker-compose -f docker-compose.traefik.yml up</pre><p>Next, start the main Docker Compose stack:</p><pre>docker-compose -f docker-compose.yml up -d</pre><h3>Check your app</h3><p>After that, if everything worked correctly (and probably it didn’t work correctly the first time 😅), you should be able to check your new application live at your domain, something like:</p><pre><a href="https://fastapi-with-traefik.example.com">https://fastapi-with-traefik.example.com</a></pre><p>And the Traefik dashboard at:</p><pre><a href="https://traefik.fastapi-with-traefik.example.com">https://traefik.fastapi-with-traefik.example.com</a></pre><p>And the Traefik dashboard would be protected by HTTP Basic Auth, so no one can go and tamper with your Traefik.</p><h3>Celebrate 🎉</h3><p>Congrats! 
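If you prefer to script that final check instead of opening a browser, a small sketch using only the standard library could fetch the app's JSON response over HTTPS (the domain below is the hypothetical one used throughout this example):

```python
import json
import urllib.request


def check_app(url: str) -> dict:
    """Fetch a URL and decode the JSON body of the response."""
    with urllib.request.urlopen(url) as response:
        return json.load(response)


# Hypothetical domain from this example; a successful deployment would
# return the greeting message defined in ./app/main.py:
# check_app("https://fastapi-with-traefik.example.com")
```

If the TLS certificate is missing or invalid, urlopen will raise an error instead of returning, so this doubles as a quick HTTPS sanity check.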
That’s a very stable way to have a production application deployed.</p><p>You can probably improve that a lot, add Continuous Integration, monitoring, logging, use a complete cluster of machines instead of a single one (e.g. use Kubernetes instead of Docker Compose), etc. There’s no limit to adding more stuff and improving it all…</p><p>But with this, you already have the minimum to serve your users a secure application.</p><p>And as your deployment is based on Docker, and can be replicated easily and quickly, you could destroy that server, create a new one from scratch, and be live again in minutes. Because it doesn’t depend on <em>that</em> specific server.</p><p>All the important configurations and setup are in your Docker Compose files.</p><p>And all the important logic and setup of the actual app are in the Docker image (with the Dockerfile).</p><p>And Docker itself is taking care of having your application running, restarting it after failures or reboots, etc.</p><h3>Dessert 🍰</h3><p>Do you want a bit more?</p><p>Check <a href="https://github.com/tiangolo/blog-posts/blob/master/deploying-fastapi-apps-with-https-powered-by-traefik/">the source code for this blog post</a>, including the latest version of the app and config files, including a basic example with Panel, and one with Streamlit. 
✨</p><h3>Learn More</h3><p>Here are some extra resources:</p><ul><li><a href="https://howhttps.works/">HowHTTPS.works</a>.</li><li><a href="https://fastapi.tiangolo.com/deployment/https/">FastAPI docs: HTTPS for developers</a>.</li><li>Event <a href="https://traefik.io/resources/traefik-fastapi-kuberrnetes-ai-ml/?utm_campaign=Influencer:%20Sebastian%20Ramirez,%20FastAPI%20&amp;utm_content=155438367&amp;utm_medium=social&amp;utm_source=twitter&amp;hss_channel=tw-4890312130">video recording in <strong>Traefik resources</strong></a>.</li><li><a href="https://github.com/tiangolo/blog-posts/blob/master/deploying-fastapi-apps-with-https-powered-by-traefik/">Source code on GitHub</a>.</li><li><a href="https://doc.traefik.io/traefik/">Traefik docs</a>.</li><li><a href="https://fastapi.tiangolo.com/">FastAPI docs</a>.</li></ul><p>I hope that was useful! 🚀</p><h3>About me</h3><p>Hey! 👋 I’m Sebastián Ramírez (<a href="https://tiangolo.com/">tiangolo</a>).</p><p>You can follow me, contact me, see what I do, or use my open source code:</p><ul><li><a href="https://github.com/tiangolo">GitHub: tiangolo</a></li><li><a href="https://twitter.com/tiangolo">Twitter: tiangolo</a></li><li><a href="https://www.linkedin.com/in/tiangolo/">LinkedIn: tiangolo</a></li><li><a href="https://dev.to/tiangolo">Dev.to: tiangolo</a></li><li><a href="https://tiangolo.medium.com/">Medium: tiangolo</a></li><li><a href="https://tiangolo.com/">Web: tiangolo.com</a></li></ul><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=e30b5058b98d" width="1" height="1" alt="">]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[FastAPI top-level dependencies]]></title>
            <link>https://tiangolo.medium.com/fastapi-top-level-dependencies-4d083a93a7ac?source=rss-963974981597------2</link>
            <guid isPermaLink="false">https://medium.com/p/4d083a93a7ac</guid>
            <category><![CDATA[fastapi]]></category>
            <category><![CDATA[python]]></category>
            <dc:creator><![CDATA[Sebastián Ramírez]]></dc:creator>
            <pubDate>Sun, 03 Jan 2021 11:17:44 GMT</pubDate>
            <atom:updated>2021-01-03T11:20:45.924Z</atom:updated>
            <content:encoded><![CDATA[<p>This article lives in:</p><ul><li><a href="https://dev.to/tiangolo/fastapi-top-level-dependencies-8ah">Dev.to</a></li><li><a href="https://tiangolo.medium.com/fastapi-top-level-dependencies-4d083a93a7ac">Medium</a></li><li><a href="https://github.com/tiangolo/blog-posts/blob/master/fastapi-top-level-dependencies/README.md">GitHub</a></li></ul><h3>Intro</h3><p>FastAPI version 0.62.0 comes with global dependencies that you can apply to a whole application, as well as top-level dependencies, tags, and other parameters for APIRouters that were previously available only on app.include_router().</p><p>This makes it easier to put configurations and dependencies (e.g. for authentication) related to a group of <em>path operations</em> closer together. 🔒</p><p>Let’s start by checking APIRouter...</p><h3>Include a router</h3><p>Imagine you had a file users.py with:</p><pre>from fastapi import APIRouter</pre><pre>router = APIRouter()<br></pre><pre>@router.get(&quot;/users/&quot;)<br>def read_users():<br>    return [&quot;rick&quot;, &quot;morty&quot;]</pre><p>And now let’s say you want to include it in the main.py file with:</p><pre>from fastapi import FastAPI, Depends<br>from . import users<br>from .dependencies import get_query_token</pre><pre>app = FastAPI()</pre><pre>app.include_router(<br>    users.router,<br>    tags=[&quot;users&quot;],<br>    dependencies=[Depends(get_query_token)]<br>)<br></pre><pre>@app.get(&quot;/&quot;)<br>def main():<br>    return {&quot;message&quot;: &quot;Hello World&quot;}</pre><p>In this example, you are applying the tag users to all the <em>path operations</em> in users.py. And you are also applying the dependency get_query_token to all of them.</p><p>This works, and it was the only/main way to do it up to version 0.62.0.</p><p>But what is not so great about it is that the tag and the dependency are mainly related to users.py, not to main.py. 
But that code had to live in main.py, instead of being closer to what it is related to.</p><h3>APIRouter top-level dependencies and tags</h3><p>Now, with FastAPI version 0.62.0, you can declare top-level dependencies, tags, and others in the APIRouter directly.</p><p>So, the new users.py can now look like:</p><pre>from fastapi import APIRouter, Depends<br>from .dependencies import get_query_token</pre><pre>router = APIRouter(<br>    tags=[&quot;users&quot;],<br>    dependencies=[Depends(get_query_token)]<br>)<br></pre><pre>@router.get(&quot;/users/&quot;)<br>def read_users():<br>    return [&quot;rick&quot;, &quot;morty&quot;]</pre><p>…notice the tags and dependencies in the APIRouter; they can now live closer to their related code! 🎉</p><p>And the main.py would be simply:</p><pre>from fastapi import FastAPI<br>from . import users</pre><pre>app = FastAPI()</pre><pre>app.include_router(<br>    users.router,<br>)<br></pre><pre>@app.get(&quot;/&quot;)<br>def main():<br>    return {&quot;message&quot;: &quot;Hello World&quot;}</pre><h3>Global dependencies</h3><p>In the same way, you can now also declare dependencies that apply to <strong>all</strong> the <em>path operations</em> in the FastAPI application:</p><pre>from fastapi import FastAPI, Depends<br>from .dependencies import get_query_token</pre><pre>app = FastAPI(<br>    dependencies=[Depends(get_query_token)]<br>)</pre><pre>@app.get(&quot;/&quot;)<br>def main():<br>    return {&quot;message&quot;: &quot;Hello World&quot;}</pre><h3>Tips</h3><p>Some tips to adopt a convention:</p><ul><li>By default, set all those configs in APIRouter().</li><li>Try to <strong>only</strong> set them in app.include_router() when you want to override some defaults that can&#39;t (or shouldn&#39;t) be set in APIRouter directly.</li><li>Set them in FastAPI() only when you want them to apply to everything, e.g. 
some default authentication for a simple app.</li></ul><h3>Learn More</h3><p>You can read more about <a href="https://fastapi.tiangolo.com/tutorial/dependencies/global-dependencies/">global dependencies</a>.</p><p>And about <a href="https://fastapi.tiangolo.com/tutorial/bigger-applications/">APIRouter top-level dependencies, tags, and others</a>.</p><p>If you don’t want to miss other news, you can subscribe to the <a href="https://fastapi.tiangolo.com/newsletter/"><strong>FastAPI and friends</strong> official newsletter</a>. 🎉</p><h3>About me</h3><p>Hey! 👋 I’m <a href="https://tiangolo.com/">tiangolo (Sebastián Ramírez)</a>.</p><p>You can follow me, contact me, see what I do, or use my open source code:</p><ul><li><a href="https://github.com/tiangolo">GitHub: tiangolo</a></li><li><a href="https://twitter.com/tiangolo">Twitter: tiangolo</a></li><li><a href="https://www.linkedin.com/in/tiangolo/">LinkedIn: tiangolo</a></li><li><a href="https://dev.to/tiangolo">Dev.to: tiangolo</a></li><li><a href="https://tiangolo.medium.com/">Medium: tiangolo</a></li><li><a href="https://tiangolo.com/">Web: tiangolo.com</a></li></ul><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=4d083a93a7ac" width="1" height="1" alt="">]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[Concurrent Burgers — Understand async / await]]></title>
            <link>https://tiangolo.medium.com/concurrent-burgers-understand-async-await-eeec05ae7cfe?source=rss-963974981597------2</link>
            <guid isPermaLink="false">https://medium.com/p/eeec05ae7cfe</guid>
            <category><![CDATA[python]]></category>
            <category><![CDATA[asynchronous]]></category>
            <category><![CDATA[concurrency]]></category>
            <dc:creator><![CDATA[Sebastián Ramírez]]></dc:creator>
            <pubDate>Sun, 17 May 2020 07:03:21 GMT</pubDate>
            <atom:updated>2020-05-17T07:05:54.288Z</atom:updated>
            <content:encoded><![CDATA[<h3>Concurrent Burgers — Understand async / await</h3><p>This article lives in:</p><ul><li><a href="https://dev.to/tiangolo/concurrent-burgers-understand-async-await-3n20">Dev.to</a></li><li><a href="https://medium.com/@tiangolo/concurrent-burgers-understand-async-await-eeec05ae7cfe">Medium</a></li><li><a href="https://github.com/tiangolo/blog-posts/blob/master/concurrent-burgers-understand-async-await/README.md">GitHub</a></li><li><a href="https://fastapi.tiangolo.com/async/">FastAPI’s docs</a> (including translations to other languages)</li></ul><h3>Intro</h3><p>Modern versions of Python (and other languages) have support for <strong>“asynchronous code”</strong> using something called <strong>“coroutines”</strong>, with <strong>async</strong> and <strong>await</strong> syntax.</p><p>Here’s a friendly and not very technical explanation to give some intuition about all that, including asynchronous code, concurrency, and parallelism.</p><p>This is taken from the <a href="https://fastapi.tiangolo.com/async/">docs for FastAPI</a>, a modern framework for building APIs in Python.</p><p>Although this was written for Python and FastAPI, the whole story and the information are relevant for other languages that also have async and await, like <strong>JavaScript</strong> and <strong>Rust</strong>.</p><p>Now, let’s go through that phrase part by part in the sections below:</p><ul><li><strong>Asynchronous Code</strong></li><li><strong>async</strong> and <strong>await</strong></li><li><strong>Coroutines</strong></li></ul><h3>Asynchronous Code</h3><p>Asynchronous code just means that the language 💬 has a way to tell the computer / program 🤖 that at some point in the code, it 🤖 will have to wait for <em>something else</em> to finish somewhere else. 
Let’s say that <em>something else</em> is called “slow-file” 📝.</p><p>So, during that time, the computer can go and do some other work, while “slow-file” 📝 finishes.</p><p>Then the computer / program 🤖 will come back whenever it has a chance, either because it’s waiting again, or because it 🤖 has finished all the work it had at that point. And it 🤖 will see if any of the tasks it was waiting for have already finished doing whatever they had to do.</p><p>Next, it 🤖 takes the first task to finish (let’s say, our “slow-file” 📝) and continues whatever it had to do with it.</p><p>That “wait for something else” normally refers to I/O operations that are relatively “slow” (compared to the speed of the processor and the RAM memory), like waiting for:</p><ul><li>the data from the client to be sent through the network</li><li>the data sent by your program to be received by the client through the network</li><li>the contents of a file on the disk to be read by the system and given to your program</li><li>the contents your program gave to the system to be written to disk</li><li>a remote API operation</li><li>a database operation to finish</li><li>a database query to return the results</li><li>etc.</li></ul><p>As the execution time is consumed mostly by waiting for I/O operations, they call them “I/O bound” operations.</p><p>It’s called “asynchronous” because the computer / program doesn’t have to be “synchronized” with the slow task, waiting for the exact moment that the task finishes, while doing nothing, to be able to take the task result and continue the work.</p><p>Instead of that, by being an “asynchronous” system, once finished, the task can wait in line a little bit (some microseconds) for the computer / program to finish whatever it went to do, and then come back to take the results and continue working with them.</p><p>For “synchronous” (as opposed to “asynchronous”) they commonly also use the term “sequential”, because the computer / program follows all the steps in sequence 
before switching to a different task, even if those steps involve waiting.</p><h3>Concurrency and Burgers</h3><p>This idea of <strong>asynchronous</strong> code described above is also sometimes called <strong>“concurrency”</strong>. It is different from <strong>“parallelism”</strong>.</p><p><strong>Concurrency</strong> and <strong>parallelism</strong> both relate to “different things happening more or less at the same time”.</p><p>But the details of <em>concurrency</em> and <em>parallelism</em> are quite different.</p><p>To see the difference, imagine the following story about burgers:</p><h3>Concurrent Burgers</h3><p>You go with your crush 😍 to get fast food 🍔, and you stand in line while the cashier 💁 takes the orders from the people in front of you.</p><p>Then it’s your turn, and you place your order of 2 very fancy burgers 🍔 for your crush 😍 and you.</p><p>You pay 💸.</p><p>The cashier 💁 says something to the guy in the kitchen 👨‍🍳 so he knows he has to prepare your burgers 🍔 (even though he is currently preparing the ones for the previous clients).</p><p>The cashier 💁 gives you the number of your turn.</p><p>While you are waiting, you go with your crush 😍 and pick a table; you sit and talk with your crush 😍 for a long time (as your burgers are very fancy and take some time to prepare ✨🍔✨).</p><p>As you are sitting at the table with your crush 😍, while you wait for the burgers 🍔, you can spend that time admiring how awesome, cute and smart your crush is ✨😍✨.</p><p>While waiting and talking to your crush 😍, from time to time, you check the number displayed on the counter to see if it’s your turn already.</p><p>Then at some point, it finally is your turn. You go to the counter, get your burgers 🍔 and come back to the table.</p><p>You and your crush 😍 eat the burgers 🍔 and have a nice time ✨.</p><p>Imagine you are the computer / program 🤖 in that story.</p><p>While you are in the line, you are just idle 😴, waiting for your turn, not doing anything very “productive”. 
But the line is fast because the cashier 💁 is only taking the orders (not preparing them), so that’s fine.</p><p>Then, when it’s your turn, you do actual “productive” work 🤓: you process the menu, decide what you want, get your crush’s 😍 choice, pay 💸, check that you hand over the correct bill or card, check that you are charged correctly, check that the order has the correct items, etc.</p><p>But then, even though you still don’t have your burgers 🍔, your work with the cashier 💁 is “on pause” ⏸, because you have to wait 🕙 for your burgers to be ready.</p><p>But as you go away from the counter and sit at the table with a number for your turn, you can switch 🔀 your attention to your crush 😍, and “work” ⏯ 🤓 on that. Then you are again doing something very “productive” 🤓, such as flirting with your crush 😍.</p><p>Then the cashier 💁 says “I’m finished with doing the burgers” 🍔 by putting your number on the counter’s display, but you don’t jump like crazy immediately when the displayed number changes to your turn number. You know no one will steal your burgers 🍔 because you have the number of your turn, and they have theirs.</p><p>So you wait for your crush 😍 to finish the story (finish the current work ⏯ / task being processed 🤓), smile gently and say that you are going for the burgers ⏸.</p><p>Then you go to the counter 🔀, to the initial task that is now finished ⏯, pick the burgers 🍔, say thanks and take them to the table. That finishes that step / task of interaction with the counter ⏹. 
That, in turn, creates a new task, of “eating burgers” 🔀 ⏯, but the previous one of “getting burgers” is finished ⏹.</p><h3>Parallel Burgers</h3><p>Now let’s imagine these aren’t “Concurrent Burgers”, but “Parallel Burgers”.</p><p>You go with your crush 😍 to get parallel fast food 🍔.</p><p>You stand in line while several (let’s say 8) cashiers who are at the same time cooks 👨‍🍳👨‍🍳👨‍🍳👨‍🍳👨‍🍳👨‍🍳👨‍🍳👨‍🍳 take the orders from the people in front of you.</p><p>Everyone before you is waiting 🕙 for their burgers 🍔 to be ready before leaving the counter because each of the 8 cashiers goes and prepares the burgers right away before getting the next order.</p><p>Then it’s finally your turn, and you place your order of 2 very fancy burgers 🍔 for your crush 😍 and you.</p><p>You pay 💸.</p><p>The cashier goes to the kitchen 👨‍🍳.</p><p>You wait, standing in front of the counter 🕙, so that no one else takes your burgers 🍔 before you do, as there are no numbers for turns.</p><p>As you and your crush 😍 are busy not letting anyone get in front of you and take your burgers whenever they arrive 🕙, you cannot pay attention to your crush 😞.</p><p>This is “synchronous” work: you are “synchronized” with the cashier/cook 👨‍🍳. 
You have to wait 🕙 and be there at the exact moment that the cashier/cook 👨‍🍳 finishes the burgers 🍔 and gives them to you; otherwise, someone else might take them.</p><p>Then your cashier/cook 👨‍🍳 finally comes back with your burgers 🍔, after a long time waiting 🕙 there in front of the counter.</p><p>You take your burgers 🍔 and go to the table with your crush 😍.</p><p>You just eat them, and you are done 🍔 ⏹.</p><p>There was not much talk or flirting as most of the time was spent waiting 🕙 in front of the counter 😞.</p><p>In this scenario of the parallel burgers, you are a computer / program 🤖 with two processors (you and your crush 😍), both waiting 🕙 and dedicating their attention ⏯ to “waiting at the counter” 🕙 for a long time.</p><p>The fast food store has 8 processors (cashiers/cooks) 👨‍🍳👨‍🍳👨‍🍳👨‍🍳👨‍🍳👨‍🍳👨‍🍳👨‍🍳, while the concurrent burgers store might have had only 2 (one cashier and one cook) 💁 👨‍🍳.</p><p>But still, the final experience is not the best 😞.</p><p>This would be the parallel equivalent story for burgers 🍔.</p><p>For a more “real life” example of this, imagine a bank.</p><p>Until recently, most banks had multiple cashiers 👨‍💼👨‍💼👨‍💼👨‍💼 and a big line 🕙🕙🕙🕙🕙🕙🕙🕙.</p><p>All of the cashiers doing all the work with one client after the other 👨‍💼⏯.</p><p>And you have to wait 🕙 in the line for a long time or you lose your turn.</p><p>You probably wouldn’t want to take your crush 😍 with you to do errands at the bank 🏦.</p><h3>Burger Conclusion</h3><p>In this scenario of “fast food burgers with your crush”, as there is a lot of waiting 🕙, it makes a lot more sense to have a concurrent system ⏸🔀⏯.</p><p>This is the case for most web applications.</p><p>Many, many users, but your server is waiting 🕙 for their not-so-good connection to send their requests.</p><p>And then waiting 🕙 again for the responses to come back.</p><p>This “waiting” 🕙 is measured in microseconds, but still, summing it all, it’s a lot of waiting in the end.</p><p>That’s 
why it makes a lot of sense to use asynchronous ⏸🔀⏯ code for web APIs.</p><p>Most of the existing popular Python frameworks (including Flask and Django) were created before the new asynchronous features in Python existed. So, the ways they can be deployed support parallel execution and an older form of asynchronous execution that is not as powerful as the new capabilities.</p><p>This is despite the fact that the main specification for asynchronous web Python (ASGI) was developed at Django, to add support for WebSockets.</p><p>That kind of asynchronicity is what made NodeJS popular (even though NodeJS is not parallel) and that’s the strength of Go as a programming language.</p><p>And that’s the same level of performance you get with <strong>FastAPI</strong>.</p><p>And as you can have parallelism and asynchronicity at the same time, you get higher performance than most of the tested NodeJS frameworks and on par with Go, which is a compiled language closer to C <a href="https://www.techempower.com/benchmarks/#section=data-r17&amp;hw=ph&amp;test=query&amp;l=zijmkf-1">(all thanks to Starlette)</a>.</p><h3>Is concurrency better than parallelism?</h3><p>Nope! That’s not the moral of the story.</p><p>Concurrency is different from parallelism. And it is better in <strong>specific</strong> scenarios that involve a lot of waiting. Because of that, it generally is a lot better than parallelism for web application development. 
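To make that waiting advantage concrete, here is a minimal sketch using Python's asyncio (the fake_request coroutine is hypothetical, with asyncio.sleep standing in for a slow network connection):

```python
import asyncio
import time

async def fake_request(i: int) -> str:
    # Hypothetical "request": all of its time is spent waiting (I/O bound)
    await asyncio.sleep(0.2)
    return f"response {i}"

async def handle_all() -> list:
    # The 10 waits overlap on a single event loop,
    # so the total is about 0.2 s instead of about 2 s
    return await asyncio.gather(*(fake_request(i) for i in range(10)))

start = time.perf_counter()
results = asyncio.run(handle_all())
elapsed = time.perf_counter() - start
print(len(results), f"in ~{elapsed:.1f}s")
```

This is the same pattern an asynchronous web server relies on: while one request is waiting, the event loop works on the others.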
But not for everything.</p><p>So, to balance that out, imagine the following short story:</p><blockquote><em>You have to clean a big, dirty house.</em></blockquote><p><em>Yep, that’s the whole story</em>.</p><p>There’s no waiting 🕙 anywhere, just a lot of work to be done, in multiple places around the house.</p><p>You could take turns as in the burgers example: first the living room, then the kitchen. But as you are not waiting 🕙 for anything, just cleaning and cleaning, the turns wouldn’t affect anything.</p><p>It would take the same amount of time to finish with or without turns (concurrency) and you would have done the same amount of work.</p><p>But in this case, if you could bring the 8 ex-cashier/cooks/now-cleaners 👨‍🍳👨‍🍳👨‍🍳👨‍🍳👨‍🍳👨‍🍳👨‍🍳👨‍🍳, and each one of them (plus you) could take a zone of the house to clean it, you could do all the work in <strong>parallel</strong>, with the extra help, and finish much sooner.</p><p>In this scenario, each one of the cleaners (including you) would be a processor, doing their part of the job.</p><p>And as most of the execution time is taken by actual work (instead of waiting), and the work in a computer is done by a CPU, they call these problems “CPU bound”.</p><p>Common examples of CPU bound operations are things that require complex math processing.</p><p>For example:</p><ul><li><strong>Audio</strong> or <strong>image processing</strong>.</li><li><strong>Computer vision</strong>: an image is composed of millions of pixels, each pixel has 3 values / colors, processing that normally requires computing something on those pixels, all at the same time.</li><li><strong>Machine Learning</strong>: it normally requires lots of “matrix” and “vector” multiplications. Think of a huge spreadsheet with numbers and multiplying all of them together at the same time.</li><li><strong>Deep Learning</strong>: this is a sub-field of Machine Learning, so, the same applies. 
It’s just that there is not a single spreadsheet of numbers to multiply, but a huge set of them, and in many cases, you use a special processor to build and / or use those models.</li></ul><h3>Concurrency + Parallelism: Web + Machine Learning</h3><p>With <strong>FastAPI</strong> you can take advantage of the concurrency that is very common in web development (the main attraction of NodeJS).</p><p>But you can also exploit the benefits of parallelism and multiprocessing (having multiple processes running in parallel) for <strong>CPU bound</strong> workloads like those in Machine Learning systems.</p><p>That, plus the simple fact that Python is the main language for <strong>Data Science</strong>, Machine Learning and especially Deep Learning, makes FastAPI a very good match for Data Science / Machine Learning web APIs and applications (among many others).</p><p>To see how to achieve this parallelism in production, see the <a href="https://fastapi.tiangolo.com/deployment/">FastAPI docs for Deployment</a>.</p><h3>async and await</h3><p>Modern versions of Python have a very intuitive way to define asynchronous code. This makes it look just like normal “sequential” code, while doing the “awaiting” for you at the right moments.</p><p>When there is an operation that will require waiting before giving the results and has support for these new Python features, you can code it like:</p><pre>burgers = await get_burgers(2)</pre><p>The key here is the await. It tells Python that it has to wait ⏸ for get_burgers(2) to finish doing its thing 🕙 before storing the results in burgers. With that, Python will know that it can go and do something else 🔀 ⏯ in the meanwhile (like receiving another request).</p><p>For await to work, it has to be inside a function that supports this asynchronicity. 
To do that, you just declare it with async def:</p><pre>async def get_burgers(number: int):<br>    # Do some asynchronous stuff to create the burgers<br>    return burgers</pre><p>…instead of def:</p><pre># This is not asynchronous<br>def get_sequential_burgers(number: int):<br>    # Do some sequential stuff to create the burgers<br>    return burgers</pre><p>With async def, Python knows that, inside that function, it has to be aware of await expressions, and that it can &quot;pause&quot; ⏸ the execution of that function and go do something else 🔀 before coming back.</p><p>When you want to call an async def function, you have to &quot;await&quot; it. So, this won&#39;t work:</p><pre># This won&#39;t work, because get_burgers was defined with: async def<br>burgers = get_burgers(2)</pre><h3>Other forms of asynchronous code</h3><p>This style of using async and await is relatively new in the language.</p><p>But it makes working with asynchronous code a lot easier.</p><p>This same syntax (or almost identical) was also included recently in modern versions of JavaScript (in Browser and NodeJS).</p><p>But before that, handling asynchronous code was quite a bit more complex and difficult.</p><p>In previous versions of Python, you could have used threads or <a href="http://www.gevent.org/">Gevent</a>. But the code is way more complex to understand, debug, and think about.</p><p>In previous versions of NodeJS / Browser JavaScript, you would have used “callbacks”, which leads to <a href="http://callbackhell.com/">callback hell</a>.</p><h3>Coroutines</h3><p><strong>Coroutine</strong> is just the very fancy term for the thing returned by an async def function. Python knows that it is something like a function that it can start and that it will end at some point, but that it might be paused ⏸ internally too, whenever there is an await inside of it.</p><p>But all this functionality of using asynchronous code with async and await is often summarized as using &quot;coroutines&quot;. 
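As a small sketch of that (reusing a get_burgers-style coroutine, with asyncio.sleep standing in for the kitchen; this exact body is an assumption for illustration), note that calling an async def function only creates the coroutine object; something still has to run it:

```python
import asyncio

async def get_burgers(number: int):
    # Hypothetical body: simulate the asynchronous work of preparing burgers
    await asyncio.sleep(0.1)
    return ["burger"] * number

# Calling the async def function does NOT run it; it returns a coroutine object
coro = get_burgers(2)
print(type(coro).__name__)  # → coroutine

# asyncio.run() starts an event loop and awaits the coroutine for us;
# inside another async def function you would write: burgers = await get_burgers(2)
burgers = asyncio.run(coro)
print(burgers)  # → ['burger', 'burger']
```

This is why forgetting the await (or asyncio.run()) leaves you with an unexecuted coroutine object instead of the burgers.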
It is comparable to the key feature of Go, the &quot;Goroutines&quot;.</p><h3>Conclusion</h3><p>Let’s see the same phrase from above:</p><blockquote><em>Modern versions of Python have support for </em><strong><em>“asynchronous code”</em></strong><em> using something called </em><strong><em>“coroutines”</em></strong><em>, with </em><strong><em>async</em></strong><em> and </em><strong><em>await</em></strong><em> syntax.</em></blockquote><p>That should make more sense now. ✨</p><p>All that is what powers FastAPI (through Starlette) and what gives it such impressive performance.</p><h3>Learn More</h3><p>This version has some of the details that are very specific to FastAPI trimmed out. If you want to learn those, including how to get asynchronous performance benefits while still writing standard def functions and using standard libraries, check the <a href="https://fastapi.tiangolo.com/async/">FastAPI docs</a>.</p><p>If you want a deeper and much more technical explanation of all this, check this <a href="https://www.youtube.com/watch?v=Xbl7XjFYsN4">video series from EdgeDB</a> by <a href="https://twitter.com/llanga">Łukasz Langa</a>.</p><h3>About me</h3><p>Hey! 👋 I’m Sebastián Ramírez (<a href="https://tiangolo.com/">tiangolo</a>).</p><p>You can follow me, contact me, ask questions, see what I do, or use my open source code:</p><ul><li><a href="https://github.com/tiangolo">GitHub: tiangolo</a></li><li><a href="https://twitter.com/tiangolo">Twitter: tiangolo</a></li><li><a href="https://www.linkedin.com/in/tiangolo/">LinkedIn: tiangolo</a></li><li><a href="https://dev.to/tiangolo">Dev.to: tiangolo</a></li><li><a href="https://medium.com/@tiangolo">Medium: tiangolo</a></li><li><a href="https://tiangolo.com/">Web: tiangolo.com</a></li></ul><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=eeec05ae7cfe" width="1" height="1" alt="">]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[Build a web API from scratch with FastAPI — the workshop]]></title>
            <link>https://tiangolo.medium.com/build-a-web-api-from-scratch-with-fastapi-the-workshop-866d089d23dc?source=rss-963974981597------2</link>
            <guid isPermaLink="false">https://medium.com/p/866d089d23dc</guid>
            <category><![CDATA[python]]></category>
            <category><![CDATA[tutorial]]></category>
            <category><![CDATA[fastapi]]></category>
            <dc:creator><![CDATA[Sebastián Ramírez]]></dc:creator>
            <pubDate>Thu, 27 Feb 2020 21:12:51 GMT</pubDate>
            <atom:updated>2020-02-27T21:13:54.942Z</atom:updated>
            <content:encoded><![CDATA[<h3>Build a web API from scratch with FastAPI — the workshop</h3><p>This article lives in:</p><ul><li><a href="https://dev.to/tiangolo/build-a-web-api-from-scratch-with-fastapi-the-workshop-2ehe">Dev.to</a></li><li><a href="https://medium.com/@tiangolo/build-a-web-api-from-scratch-with-fastapi-the-workshop-866d089d23dc">Medium</a></li><li><a href="https://github.com/tiangolo/blog-posts/blob/master/pyconby-web-api-from-scratch-with-fastapi/README.md">GitHub</a></li></ul><h3>The first FastAPI workshop at PyCon Belarus</h3><p>Last weekend I had the chance to go to <a href="https://by.pycon.org/">PyCon Belarus</a>. I had a great time and met a lot of great people.</p><p>I gave a talk there:</p><h3>Camila Gutiérrez on Twitter</h3><p>&quot;Serve ML models easily with FastAPI&quot; @tiangolo&#39;s conference talk in #pyconby just started! @pyconby 🎉🎉</p><p>And a workshop with about 60 people:</p><h3>PyConBY 2020 (Feb 21-22) Conference on Twitter</h3><p>Workshops Day is on! Learning more about FastAPI and Catalyst with @tiangolo and @tez_romach. #pyconby</p><h3>Creating the workshop</h3><p>When creating the workshop I got a bit excited and created too much content for the time I had available.</p><p>The final app ended up having basic OAuth2 authentication, authorization handling with dependencies, tests with full coverage, etc.</p><p>I gave a trial run of the full workshop to <a href="https://twitter.com/Mariacamilagl30">Camila</a> and the total time was about 9 hours; it wasn’t really possible to give it all in 3 hours.</p><p>But as it was made in incremental steps, completing a full new version of the app at every step (or every 2 steps), we could start it and go through it, step by step, and advance as much as possible. 
And wherever we ended up by the end would still be a valid version of the app.</p><h3>The first version of the workshop</h3><p>The speed of a workshop like this has a constant tradeoff, as there are always people who finish some part faster than others, so at some points some people will be “bored” while others will be rushing to finish a part before the next one comes.</p><p>Nevertheless, at the workshop at PyCon Belarus the developers were quite fast, and we were able to go up to version 8 of the app, while I was expecting to get only up to about version 5.</p><p>But there were 15 versions. So, for those who wanted to see the final version, here it is.</p><h3>Source code for the final version</h3><p>I don’t have an easy way to provide it step by step with all the explanations here, but if you are curious you can still check the last version of the code here.</p><p>It was all based on the same FastAPI documentation, so, if you want to understand it all, you can just follow the full <a href="https://fastapi.tiangolo.com/tutorial/">FastAPI Tutorial — User Guide</a>.</p><p>Below are the initial setup instructions and then the link to the full code of the last version.</p><h3>Create a project directory</h3><p>Create a directory for the project.</p><p>For example ./apiapp/:</p><pre>$ mkdir apiapp<br>$ cd ./apiapp/</pre><h3>Create a Python environment</h3><p>In the ./apiapp/ directory, create a new Python environment.</p><p>You could be using Poetry, Pipenv, or other tools.</p><p>To make it simple we are going to use pure Python with the venv module.</p><p>Make sure you have Python version 3.6 or above:</p><pre>$ python --version</pre><pre># OR</pre><pre>$ python3.6 --version</pre><pre># OR</pre><pre>$ python3.7 --version</pre><p>Then use that Python 3.6+ to create a new environment for your project.</p><pre>$ python -m venv env</pre><p>That will create a directory ./env/ that will contain a full Python environment, with its own packages, etc.</p><p>And in that environment 
is where we are going to install packages and everything.</p><h3>Initialize git</h3><pre>$ git init</pre><h3>Ignore that environment in git</h3><p>Inside of that ./env/ directory, create a file .gitignore with the contents:</p><pre>*</pre><p>(that’s a single * in the file).</p><p>That will tell git that we want to ignore everything inside that directory.</p><h3>Activate the environment</h3><p>Now we need to “activate” the environment.</p><p>This will tell your terminal that when you try to run python it should use the new Python environment you just created in that ./env/ directory and not the global one.</p><p>Activate the environment:</p><pre>$ source ./env/bin/activate</pre><p>To make sure that it worked, check which Python binary is used by your terminal.</p><pre>$ which python</pre><p>It should show the path of the new Python, something like:</p><pre>/home/user/code/apiapp/env/bin/python</pre><h4>Activate in Windows</h4><p>If you are on Windows, in Git Bash, activate with:</p><pre>$ source ./env/Scripts/activate</pre><p>If you are using PowerShell, activate with:</p><pre>$ .\env\Scripts\Activate.ps1</pre><p>To confirm which Python you have in PowerShell, use:</p><pre>$ Get-Command python</pre><h4>Deactivate an environment</h4><p>We don’t need to deactivate the environment because we are going to keep using it. But if you need to deactivate it later, just run:</p><pre>$ deactivate</pre><h3>Open your editor</h3><p>Open your editor and select that environment.</p><h3>Visual Studio Code</h3><p>If you have Visual Studio Code and a shell like Bash, you can just run:</p><pre>$ code ./</pre><p>In any case, make sure you select the environment you just created for your editor.</p><p>If you use Visual Studio Code, make sure you have the <a href="https://code.visualstudio.com/docs/python/python-tutorial#_install-visual-studio-code-and-the-python-extension">Python Extension</a>.</p><p>You can then create a dummy file dummy.py and open it. 
That will make VS Code load the extension and show the Python environment used.</p><p>In the lower left corner you will see the Python version used; if you click it, you can select a different one.</p><p>After this, you can delete the dummy.py file.</p><h3>PyCharm</h3><p>If you use PyCharm as your editor, open it.</p><p>Select your project directory ./apiapp/ as the workspace.</p><p>Then <a href="https://www.jetbrains.com/help/pycharm/configuring-python-interpreter.html">Configure a Python interpreter</a> for your project, and select the interpreter inside of the environment you just created.</p><h3>Using the correct environment</h3><p>Using the correct environment in your editor as we described here and opening it exactly in your project directory will let your editor know about the installed packages and provide autocompletion, type checks, relative imports, etc.</p><p>If you didn’t configure the environment correctly or if you didn’t open it exactly in your project directory (for example, if you open one directory above), your editor won’t be able to give you all those features.</p><h3>Create files</h3><p>Now, in your editor, create a directory app. 
It will store all your actual code.</p><p>Inside of that app directory, create 2 empty files, main.py and __init__.py.</p><p>And inside of your project directory, right next to the app directory, create an empty requirements.txt file.</p><p>Your file structure should look like:</p><pre>apiapp<br>├── app<br>│   ├── __init__.py<br>│   └── main.py<br>├── requirements.txt<br>└── env<br>    └── ...</pre><h3>Requirements</h3><p>Edit your requirements.txt to have the following contents:</p><pre>fastapi<br>uvicorn<br>sqlalchemy<br>async-exit-stack<br>async-generator<br>passlib[bcrypt]<br>python-jose[cryptography]<br>python-multipart<br>python-dotenv</pre><h3>Install requirements</h3><p>Now install the requirements from that requirements.txt file in the terminal with:</p><pre>$ pip install -r requirements.txt</pre><h3>Dev requirements</h3><p>Now, to facilitate development, we’ll also add extra packages that will help us during development.</p><p>Create a file dev-requirements.txt with:</p><pre>black<br>mypy<br>flake8<br>requests<br>pytest<br>pytest-cov<br>isort<br>autoflake</pre><h3>Install dev requirements</h3><p>And now install the development requirements in the same way:</p><pre>$ pip install -r dev-requirements.txt</pre><h3>In VS Code</h3><p>Enable the Language Server, mypy, and black in the settings.</p><h3>Reload editor</h3><p>You might need to reload your editor for it to be able to detect the newly installed packages.</p><h3>Reload environment</h3><p>Right after installing new Python packages in your environment, you should activate the environment again:</p><pre>$ source ./env/bin/activate</pre><p>Note: If you are on Windows, use the equivalent command.</p><p>This will ensure that packages that have a command, like uvicorn, will be available in your terminal.</p><p>Make sure that uvicorn is available and is the correct version after installing and re-activating the environment:</p><pre>$ which uvicorn</pre><p>It should show the uvicorn from your environment.</p><h3>Note: 
Other package managers</h3><p>If you used a different environment and package manager like Poetry or Pipenv, the requirements.txt file would be a different file and it would be managed differently, but here we are using the simplest version with the pure/standard Python packages (venv, pip, etc).</p><h3>The app — version 1</h3><p>Now we are going to create the first version of our app.</p><h3>Edit — v1</h3><p>Edit the file main.py...</p><h3>Run — v1</h3><p>Run it:</p><pre>$ uvicorn app.main:app --reload</pre><ul><li>Edit main.py again, Uvicorn auto-reloads.</li></ul><h3>Final Version</h3><p>The final version of the source code is here: <a href="https://github.com/tiangolo/blog-posts/tree/master/pyconby-web-api-from-scratch-with-fastapi/apiapp">https://github.com/tiangolo/blog-posts/tree/master/pyconby-web-api-from-scratch-with-fastapi/apiapp</a></p><h3>Additional scripts</h3><p>There’s a script to run the tests and report coverage in HTML so that you can explore it in your browser:</p><pre>$ bash test.sh</pre><p>And there is a script to format all the code automatically:</p><pre>$ bash format.sh</pre><h3>About me</h3><p>You can follow me, contact me, ask questions, see what I do, or use my open source code:</p><ul><li><a href="https://github.com/tiangolo">GitHub</a></li><li><a href="https://twitter.com/tiangolo">Twitter</a></li><li><a href="https://www.linkedin.com/in/tiangolo/">Linkedin</a></li><li><a href="https://dev.to/tiangolo">Dev.to</a></li><li><a href="https://medium.com/@tiangolo">Medium</a></li></ul><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=866d089d23dc" width="1" height="1" alt="">]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[How to start contributing to open source]]></title>
            <link>https://tiangolo.medium.com/how-to-start-contributing-to-open-source-49488f3ad6e?source=rss-963974981597------2</link>
            <guid isPermaLink="false">https://medium.com/p/49488f3ad6e</guid>
            <category><![CDATA[open-source]]></category>
            <category><![CDATA[tutorial]]></category>
            <dc:creator><![CDATA[Sebastián Ramírez]]></dc:creator>
            <pubDate>Mon, 23 Dec 2019 07:04:15 GMT</pubDate>
            <atom:updated>2019-12-23T07:09:51.291Z</atom:updated>
            <content:encoded><![CDATA[<p>Here’s a tip to help you get started contributing to open source (if you haven’t started yet).</p><p>This article lives in:</p><ul><li><a href="https://dev.to/tiangolo/how-to-start-contributing-to-open-source-1jmg">Dev.to</a></li><li><a href="https://medium.com/@tiangolo/how-to-start-contributing-to-open-source-49488f3ad6e">Medium</a></li><li><a href="https://github.com/tiangolo/blog-posts/blob/master/how-to-start-contributing-to-open-source">GitHub</a></li></ul><h3>TL;DR</h3><p>(too long, didn’t read)</p><p>Newbies are great at docs, better than maintainers. Start with that.</p><h3>Find a problem</h3><p>First, find a problem that you want to solve, something that you care about.</p><p>If you see you can solve it easily without technology, cool, go and do that. And come back with a new problem ;)</p><p>Then find how technology could help to solve it, what kind of app or system would help.</p><h3>Find a project</h3><p>Then find an open source project that would help to solve that problem.</p><p>If there’s a tool that solves the whole problem, that’s great. But it’s more probable that you will have to divide the problem into smaller parts, and then find an open source tool that would help with some part of the problem.</p><h3>Study it</h3><p>Study that open source tool.</p><p>Read its documentation, try the examples, learn how to use it.</p><p>If it’s solving a problem you care about (or part of it) it will be useful, it will be interesting to study. 
And having a real-world objective for that tool helps you put the focus where it should be.</p><h3>Documentation</h3><p>While you study that tool, there’s a good chance that the documentation has obsolete parts, needs updates, clarifications, etc.</p><p>Or simply, it could have been explained better.</p><p>Now that you are just starting and have a fresh point of view on that tool, it’s the perfect moment to improve its documentation.</p><p>You need a few steps first:</p><ul><li>Learn how to make a <a href="https://help.github.com/en/github/collaborating-with-issues-and-pull-requests/creating-a-pull-request">pull request</a> with change suggestions.</li><li>Learn about the docs for that project. How they are created. They are normally some text files in a directory in the same project, or in a sibling project. But you can normally just edit the small part that you found without having to learn everything about that project’s docs system.</li><li><a href="https://help.github.com/en/github/getting-started-with-github/fork-a-repo">Fork that project</a> to get your own version to change and propose changes to the original.</li></ul><h3>Update the docs</h3><p>Now add documentation for that project.</p><p>Add the examples that would have helped <em>you</em> get started, with a focus on a simple real-life problem (like yours) that this tool would help solve.</p><p>Add the text that would have explained to <em>you</em> the same concepts more easily.</p><p>Add the definitions of the words the maintainer assumed were obvious but that you needed to know to take the first steps.</p><p>Add documentation for the things that aren’t documented yet.</p><p>Update the things that seem complex or difficult. Once you understand them, think about how someone could have explained them better to <em>you</em>, and write that.</p><p>Fix typos. There are always typos.</p><p>Any of those changes would probably be an independent pull request. 
Keep them separated, with the minimum changes, with a very clear scope. That helps maintainers accept just the things they are fine with.</p><p>After that, you will already have a bunch of contributions to open source. From there on it’s a lot easier.</p><p>But that’s a good way to get started.</p><h3>The newbie advantage</h3><p>If you are a “newbie” with that open source tool, if you are just getting started with it, you have a great advantage over the maintainers or any other more experienced developer who uses that tool. And it’s exactly that. That you are a <em>newbie</em>… you know nothing about that tool.</p><p>That means that you have a completely fresh point of view. You don’t take anything for granted with that tool. You don’t assume as obvious some basic concepts of the tool because you weren’t the same person developing it for months, knowing all its internal workings.</p><p>Also, you are actually <em>reading</em> the docs. And you are reading them in order. So, you can more easily spot things that could have been explained before or better. You can find inconsistencies in different parts.</p><p>And you won’t be reading it “like you know it”: it won’t be the 30th time you read those same docs, it will be the first. So, you will more easily be able to detect incorrect things, typos, etc.</p><p>A maintainer, an experienced user, or a contributor won’t normally spend much time reading the docs again and again, so those “small” errors accumulate.</p><p>And a maintainer or the project creator will never have a <em>fresh</em> point of view, without all the background of having built the whole tool.</p><p>A newbie has a fresh point of view. 
And that’s an advantage.</p><p>Start with docs!</p><h3>About me</h3><p>You can follow me, contact me, ask questions, see what I do, or use my open source code:</p><ul><li><a href="https://github.com/tiangolo">GitHub</a></li><li><a href="https://twitter.com/tiangolo">Twitter</a></li><li><a href="https://www.linkedin.com/in/tiangolo/">LinkedIn</a></li><li><a href="https://dev.to/tiangolo">Dev.to</a></li><li><a href="https://medium.com/@tiangolo">Medium</a></li></ul>]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[Introducing FastAPI]]></title>
            <link>https://tiangolo.medium.com/introducing-fastapi-fdc1206d453f?source=rss-963974981597------2</link>
            <guid isPermaLink="false">https://medium.com/p/fdc1206d453f</guid>
            <category><![CDATA[open-api]]></category>
            <category><![CDATA[web]]></category>
            <category><![CDATA[python]]></category>
            <category><![CDATA[framework]]></category>
            <category><![CDATA[api]]></category>
            <dc:creator><![CDATA[Sebastián Ramírez]]></dc:creator>
            <pubDate>Mon, 04 Feb 2019 11:25:33 GMT</pubDate>
            <atom:updated>2019-02-04T11:25:51.082Z</atom:updated>
            <content:encoded><![CDATA[<blockquote><em>FastAPI is a modern, fast (high-performance), web framework for building APIs with Python 3.6+ based on standard Python type hints.</em></blockquote><p>This article lives in:</p><ul><li><a href="https://medium.com/@tiangolo/introducing-fastapi-fdc1206d453f">Medium</a></li><li><a href="https://github.com/tiangolo/medium-posts/tree/master/introducing-fastapi">GitHub</a></li><li><a href="https://fastapi.tiangolo.com/">FastAPI (original documentation)</a></li></ul><h3>Intro</h3><p>I have been avoiding the creation of a new framework for several years. First I tried to solve all the features covered by FastAPI using many different frameworks, plug-ins, and tools.</p><p>But at some point, there was no option other than to create something that provided all these features, taking the best ideas from previous tools, and combining them in the best way possible, using language features that weren’t even available before (Python 3.6+ type hints).</p><p>Documentation: <a href="https://fastapi.tiangolo.com/">https://fastapi.tiangolo.com</a></p><p>Source Code: <a href="https://github.com/tiangolo/fastapi">https://github.com/tiangolo/fastapi</a></p><h3>Key Features</h3><ul><li>Fast: Very high performance, on par with NodeJS and Go (thanks to Starlette and Pydantic). <a href="https://github.com/tiangolo/medium-posts/tree/master/introducing-fastapi#performance">One of the fastest Python frameworks available</a>.</li><li>Fast to code: Increase the speed to develop features by about 200% to 300%.*</li><li>Fewer bugs: Reduce about 40% of human (developer) induced errors.*</li><li>Intuitive: Great editor support. Completion everywhere. Less time debugging.</li><li>Easy: Designed to be easy to use and learn. Less time reading docs.</li><li>Short: Minimize code duplication. Multiple features from each parameter declaration. Fewer bugs.</li><li>Robust: Get production-ready code. 
With automatic interactive documentation.</li><li>Standards-based: Based on (and fully compatible with) the open standards for APIs: <a href="https://github.com/OAI/OpenAPI-Specification">OpenAPI</a> (previously known as Swagger) and <a href="http://json-schema.org/">JSON Schema</a>.</li></ul><p>* Estimation based on tests on an internal development team, building production applications.</p><h3>Installation</h3><pre>$ pip install fastapi</pre><p>You will also need an ASGI server for production, such as <a href="http://www.uvicorn.org/">Uvicorn</a>.</p><pre>$ pip install uvicorn</pre><h3>Example</h3><h3>Create it</h3><ul><li>Create a file main.py with:</li></ul><pre>from fastapi import FastAPI</pre><pre>app = FastAPI()<br></pre><pre>@app.get(&quot;/&quot;)<br>def read_root():<br>    return {&quot;Hello&quot;: &quot;World&quot;}<br></pre><pre>@app.get(&quot;/items/{item_id}&quot;)<br>def read_item(item_id: int, q: str = None):<br>    return {&quot;item_id&quot;: item_id, &quot;q&quot;: q}</pre><p><a href="https://fastapi.tiangolo.com/async/#in-a-hurry">Or use async def</a>...</p><h3>Run it</h3><p>Run the server with:</p><pre>$ uvicorn main:app --reload</pre><h3>Check it</h3><p>Open your browser at <a href="http://127.0.0.1:8000/items/5?q=somequery">http://127.0.0.1:8000/items/5?q=somequery</a>.</p><p>You will see the JSON response as:</p><pre>{&quot;item_id&quot;: 5, &quot;q&quot;: &quot;somequery&quot;}</pre><p>You already created an API that:</p><ul><li>Receives HTTP requests in the <em>paths</em> / and /items/{item_id}.</li><li>Both <em>paths</em> take GET <em>operations</em> (also known as HTTP <em>methods</em>).</li><li>The <em>path</em> /items/{item_id} has a <em>path parameter</em> item_id that should be an int.</li><li>The <em>path</em> /items/{item_id} has an optional str <em>query parameter</em> q.</li></ul><h3>Interactive API docs</h3><p>Now go to <a href="http://127.0.0.1:8000/docs">http://127.0.0.1:8000/docs</a>.</p><p>You will see the automatic interactive API 
documentation (provided by <a href="https://github.com/swagger-api/swagger-ui">Swagger UI</a>):</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/960/0*3J41M5I1oh9yrTtj" /></figure><h3>Alternative API docs</h3><p>And now, go to <a href="http://127.0.0.1:8000/redoc">http://127.0.0.1:8000/redoc</a>.</p><p>You will see the alternative automatic documentation (provided by <a href="https://github.com/Rebilly/ReDoc">ReDoc</a>):</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/960/0*d_Wv9Z6VS1M45ZpF" /></figure><h3>Example upgrade</h3><p>Now modify the file main.py to receive a body from a PUT request.</p><p>Declare the body using standard Python types, thanks to Pydantic.</p><pre>from fastapi import FastAPI<br>from pydantic import BaseModel</pre><pre>app = FastAPI()<br></pre><pre>class Item(BaseModel):<br>    name: str<br>    price: float<br>    is_offer: bool = None<br></pre><pre>@app.get(&quot;/&quot;)<br>def read_root():<br>    return {&quot;Hello&quot;: &quot;World&quot;}<br></pre><pre>@app.get(&quot;/items/{item_id}&quot;)<br>def read_item(item_id: int, q: str = None):<br>    return {&quot;item_id&quot;: item_id, &quot;q&quot;: q}<br></pre><pre>@app.put(&quot;/items/{item_id}&quot;)<br>def create_item(item_id: int, item: Item):<br>    return {&quot;item_name&quot;: item.name, &quot;item_id&quot;: item_id}</pre><p>The server should reload automatically (because you ran Uvicorn with the --reload option).</p><h3>Interactive API docs upgrade</h3><p>Now go to <a href="http://127.0.0.1:8000/docs">http://127.0.0.1:8000/docs</a>.</p><ul><li>The interactive API documentation will be automatically updated, including the new body:</li></ul><figure><img alt="" src="https://cdn-images-1.medium.com/max/960/0*bhu89-FvXVsLhtQd" /></figure><ul><li>Click the “Try it out” button; it allows you to fill in the parameters and interact directly with the API:</li></ul><figure><img alt="" src="https://cdn-images-1.medium.com/max/960/0*8pDMBk7uM-tHEPNP" 
/></figure><ul><li>Then click the “Execute” button; the user interface will communicate with your API, send the parameters, get the results, and show them on the screen:</li></ul><figure><img alt="" src="https://cdn-images-1.medium.com/max/960/0*66EbtbUHYOPfH6LU" /></figure><h3>Alternative API docs upgrade</h3><p>And now, go to <a href="http://127.0.0.1:8000/redoc">http://127.0.0.1:8000/redoc</a>.</p><ul><li>The alternative documentation will also reflect the new query parameter and body:</li></ul><figure><img alt="" src="https://cdn-images-1.medium.com/max/960/0*PNPj_svvgTeWLWAh" /></figure><h3>Recap</h3><p>In summary, you declare the types of parameters, body, etc. once, as function parameters.</p><p>You do that with standard modern Python types.</p><p>You don’t have to learn a new syntax, the methods or classes of a specific library, etc.</p><p>Just standard Python 3.6+.</p><p>For example, for an int:</p><pre>item_id: int</pre><p>or for a more complex Item model:</p><pre>item: Item</pre><p>…and with that single declaration you get:</p><ul><li>Editor support, including:</li><li>Completion.</li><li>Type checks.</li><li>Validation of data:</li><li>Automatic and clear errors when the data is invalid.</li><li>Validation even for deeply nested JSON objects.</li><li>Conversion of input data: coming from the network to Python data and types. 
Reading from:</li><li>JSON.</li><li>Path parameters.</li><li>Query parameters.</li><li>Cookies.</li><li>Headers.</li><li>Forms.</li><li>Files.</li><li>Conversion of output data: converting from Python data and types to network data (as JSON):</li><li>Convert Python types (str, int, float, bool, list, etc).</li><li>datetime objects.</li><li>UUID objects.</li><li>Database models.</li><li>…and many more.</li><li>Automatic interactive API documentation, including 2 alternative user interfaces:</li><li>Swagger UI.</li><li>ReDoc.</li></ul><p>Coming back to the previous code example, FastAPI will:</p><ul><li>Validate that there is an item_id in the path for GET and PUT requests.</li><li>Validate that the item_id is of type int for GET and PUT requests.</li><li>If it is not, the client will see a useful, clear error.</li><li>Check if there is an optional query parameter named q (as in http://127.0.0.1:8000/items/foo?q=somequery) for GET requests.</li><li>As the q parameter is declared with = None, it is optional.</li><li>Without the None it would be required (as is the body in the case with PUT).</li><li>For PUT requests to /items/{item_id}, read the body as JSON:</li><li>Check that it has a required attribute name that should be a str.</li><li>Check that it has a required attribute price that has to be a float.</li><li>Check that it has an optional attribute is_offer that should be a bool, if present.</li><li>All this would also work for deeply nested JSON objects.</li><li>Convert from and to JSON automatically.</li><li>Document everything with OpenAPI, which can be used by:</li><li>Interactive documentation systems.</li><li>Automatic client code generation systems, for many languages.</li><li>Provide 2 interactive documentation web interfaces directly.</li></ul><p>We just scratched the surface, but you already get the idea of how it all works.</p><p>Try changing the line with:</p><pre>return {&quot;item_name&quot;: item.name, &quot;item_id&quot;: 
item_id}</pre><p>…from:</p><pre>... &quot;item_name&quot;: item.name ...</pre><p>…to:</p><pre>... &quot;item_price&quot;: item.price ...</pre><p>…and see how your editor will auto-complete the attributes and know their types:</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/0*F-xpMQPqabrAgiZc" /></figure><p>For a more complete example including more features, see the <a href="https://fastapi.tiangolo.com/tutorial/intro/">Tutorial — User Guide</a>.</p><p>Spoiler alert: the tutorial — user guide includes:</p><ul><li>Declaration of parameters from other places, such as headers, cookies, form fields, and files.</li><li>How to set validation constraints such as max_length or regex.</li><li>A very powerful and easy to use Dependency Injection system.</li><li>Security and authentication, including support for OAuth2 with JWT tokens and HTTP Basic auth.</li><li>More advanced (but equally easy) techniques for declaring deeply nested JSON models (thanks to Pydantic).</li><li>Many extra features (thanks to Starlette) such as:</li><li>WebSockets</li><li>GraphQL</li><li>extremely easy tests based on requests and pytest</li><li>CORS</li><li>Cookie Sessions</li><li>…and more.</li></ul><h3>Performance</h3><p>Independent TechEmpower benchmarks show FastAPI applications running under Uvicorn as <a href="https://www.techempower.com/benchmarks/#section=test&amp;runid=a979de55-980d-4721-a46f-77298b3f3923&amp;hw=ph&amp;test=fortune&amp;l=zijzen-7">one of the fastest Python frameworks available</a>, only below Starlette and Uvicorn themselves (used internally by FastAPI). 
(*)</p><p>To understand more about it, see the section <a href="https://fastapi.tiangolo.com/benchmarks/">Benchmarks</a>.</p><h3>Learn more</h3><p>Documentation: <a href="https://fastapi.tiangolo.com/">https://fastapi.tiangolo.com</a></p><p>Source Code: <a href="https://github.com/tiangolo/fastapi">https://github.com/tiangolo/fastapi</a></p><h3>About me</h3><p>You can follow me, contact me, ask questions, see what I do, or use my open source code:</p><ul><li><a href="https://github.com/tiangolo">GitHub</a></li><li><a href="https://twitter.com/tiangolo">Twitter</a></li><li><a href="https://www.linkedin.com/in/tiangolo/">LinkedIn</a></li><li><a href="https://medium.com/@tiangolo">Medium</a></li></ul>]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[Docker Swarm with Swarmprom for real-time monitoring and alerts]]></title>
            <link>https://tiangolo.medium.com/docker-swarm-with-swarmprom-for-real-time-monitoring-and-alerts-282da7890698?source=rss-963974981597------2</link>
            <guid isPermaLink="false">https://medium.com/p/282da7890698</guid>
            <category><![CDATA[https]]></category>
            <category><![CDATA[traefik]]></category>
            <category><![CDATA[docker]]></category>
            <category><![CDATA[grafana]]></category>
            <category><![CDATA[prometheus]]></category>
            <dc:creator><![CDATA[Sebastián Ramírez]]></dc:creator>
            <pubDate>Sat, 19 Jan 2019 14:20:17 GMT</pubDate>
            <atom:updated>2019-01-30T05:51:22.510Z</atom:updated>
            <content:encoded><![CDATA[<p>This article lives in:</p><ul><li><a href="https://medium.com/@tiangolo/docker-swarm-with-swarmprom-for-real-time-monitoring-and-alerts-282da7890698">Medium</a></li><li><a href="https://github.com/tiangolo/medium-posts/tree/master/docker-swarm-with-swarmprom-for-real-time-monitoring-and-alerts">GitHub</a></li><li><a href="https://dockerswarm.rocks/swarmprom/">DockerSwarm.rocks</a></li></ul><h3>Intro</h3><p>Let’s say you already set up a Docker Swarm mode cluster as described in <a href="https://dockerswarm.rocks/">DockerSwarm.rocks</a>, with a <a href="https://dockerswarm.rocks/traefik/">Traefik distributed HTTPS proxy</a>.</p><p>Here’s how you can set up <a href="https://github.com/stefanprodan/swarmprom">Swarmprom</a> to monitor your cluster.</p><p>It will allow you to:</p><ul><li>Monitor CPU, disk, memory usage, etc.</li><li>Monitor it all per node, per service, per container, etc.</li><li>Have a nice, interactive, real-time dashboard with all the data nicely plotted.</li><li>Trigger alerts (for example, in Slack, Rocket.chat, etc.) when your services/nodes pass certain thresholds.</li><li>And more…</li></ul><p>Swarmprom is actually just a set of tools pre-configured in a smart way for a Docker Swarm cluster.</p><p>It includes:</p><ul><li><a href="https://prometheus.io/">Prometheus</a></li><li><a href="https://grafana.com/">Grafana</a></li><li><a href="https://github.com/google/cadvisor">cAdvisor</a></li><li><a href="https://github.com/prometheus/node_exporter">Node Exporter</a></li><li><a href="https://github.com/prometheus/alertmanager">Alert Manager</a></li><li><a href="https://github.com/cloudflare/unsee">Unsee</a></li></ul><p>Here’s what it looks like:</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/0*y4o3pTgpjTl1hfD-" /></figure><h3>Instructions</h3><ul><li>Clone the Swarmprom repository and enter the directory:</li></ul><pre>$ git clone <a 
href="https://github.com/stefanprodan/swarmprom.git">https://github.com/stefanprodan/swarmprom.git</a><br>$ cd swarmprom</pre><ul><li>Set and export an ADMIN_USER environment variable:</li></ul><pre>export ADMIN_USER=admin</pre><ul><li>Set and export an ADMIN_PASSWORD environment variable:</li></ul><pre>export ADMIN_PASSWORD=changethis</pre><ul><li>Set and export a hashed version of the ADMIN_PASSWORD using openssl, it will be used by Traefik&#39;s HTTP Basic Auth for most of the services:</li></ul><pre>export HASHED_PASSWORD=$(openssl passwd -apr1 $ADMIN_PASSWORD)</pre><ul><li>You can check the contents with:</li></ul><pre>echo $HASHED_PASSWORD</pre><p>it will look like:</p><pre>$apr1$89eqM5Ro$CxaFELthUKV21DpI3UTQO.</pre><ul><li>Create and export an environment variable DOMAIN, e.g.:</li></ul><pre>export DOMAIN=example.com</pre><p>and make sure that the following sub-domains point to your Docker Swarm cluster IPs:</p><ul><li>grafana.example.com</li><li>alertmanager.example.com</li><li>unsee.example.com</li><li>prometheus.example.com</li></ul><p>(and replace example.com with your actual domain).</p><p>Note: You can also use a subdomain, like swarmprom.example.com. Just make sure that the subdomains point to (at least one of) your cluster IPs. 
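</p><p>Before deploying, you can sanity-check that DNS setup from Python. This is only an illustrative sketch: the subdomains, the cluster IPs, and the fake_resolve stub below are placeholder values, not part of Swarmprom.</p>

```python
import socket

def check_subdomains(subdomains, cluster_ips, resolve=socket.gethostbyname):
    """Map each subdomain to True if it resolves to one of the
    given cluster IPs, and to False otherwise (including when it
    does not resolve at all)."""
    results = {}
    for name in subdomains:
        try:
            results[name] = resolve(name) in cluster_ips
        except socket.gaierror:  # the name does not resolve
            results[name] = False
    return results

# A stubbed resolver so the example runs without real DNS records:
def fake_resolve(name):
    table = {"grafana.example.com": "203.0.113.10"}
    if name not in table:
        raise socket.gaierror(name)
    return table[name]

print(check_subdomains(
    ["grafana.example.com", "prometheus.example.com"],
    {"203.0.113.10"},
    resolve=fake_resolve,
))
```

<p>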
Or set up a wildcard subdomain (*).</p><ul><li>Set and export an environment variable with the tag used by the public Traefik proxy to filter services (by default, it’s traefik-public):</li></ul><pre>export TRAEFIK_PUBLIC_TAG=traefik-public</pre><ul><li>If you are using Slack and want to integrate it, set the following environment variables:</li></ul><pre>export SLACK_URL=https://hooks.slack.com/services/TOKEN<br>export SLACK_CHANNEL=devops-alerts<br>export SLACK_USER=alertmanager</pre><p>Note: by using export when declaring all the environment variables above, the next command will be able to use them.</p><ul><li>Deploy the Traefik version of the stack:</li></ul><pre>docker stack deploy -c docker-compose.traefik.yml swarmprom</pre><p>To test it, go to each URL:</p><ul><li><a href="https://grafana.example.com">https://grafana.example.com</a></li><li><a href="https://alertmanager.example.com">https://alertmanager.example.com</a></li><li><a href="https://unsee.example.com">https://unsee.example.com</a></li><li><a href="https://prometheus.example.com">https://prometheus.example.com</a></li></ul><h3>About me</h3><p>You can follow me, contact me, ask questions, see what I do, or use my open source code:</p><ul><li><a href="https://github.com/tiangolo">GitHub</a></li><li><a href="https://twitter.com/tiangolo">Twitter</a></li><li><a href="https://www.linkedin.com/in/tiangolo/">LinkedIn</a></li><li><a href="https://medium.com/@tiangolo">Medium</a></li></ul>]]></content:encoded>
        </item>
    </channel>
</rss>