That would largely depend on the app, but yes, it's possible. I try to avoid it, though.
*sigh*
Scaling happens every time you display an image (or video) at its non-native resolution.
If you're watching 1080p content on a MacBook full-screen, then it's being scaled, because there are no MacBooks with 1080p displays. This is unavoidable.
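To put a rough number on it, here's a quick sketch of the arithmetic; the 3024x1964 panel is the 14-inch MacBook Pro's, but any modern MacBook gives a similarly non-integer result:

```python
# Back-of-the-envelope: how a 1080p frame maps onto a MacBook panel.
# The panel resolution here is the 14" MacBook Pro's (3024x1964); the
# exact model doesn't matter, the scale factor is never an integer.
content_w, content_h = 1920, 1080
panel_w, panel_h = 3024, 1964

# Fit the frame inside the panel, preserving aspect ratio (letterboxing).
scale = min(panel_w / content_w, panel_h / content_h)
out_w, out_h = round(content_w * scale), round(content_h * scale)

print(f"scale factor: {scale:.3f}")       # 1.575 -- not an integer
print(f"displayed at: {out_w}x{out_h}")   # 3024x1701, every pixel resampled
```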
There are different algorithms for scaling. The oldest, simplest, fastest, and worst is nearest-neighbor: just drop or duplicate rows and columns of pixels as needed. This leads to obvious pixelation if your content is very low resolution.
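Here's a minimal sketch of that drop/duplicate approach, just to show where the blockiness comes from:

```python
import numpy as np

def nearest_neighbor_scale(img: np.ndarray, out_h: int, out_w: int) -> np.ndarray:
    """Scale by dropping/duplicating rows and columns -- the 'worst' algorithm."""
    in_h, in_w = img.shape[:2]
    # For each output pixel, pick the single closest input pixel. No blending.
    rows = np.arange(out_h) * in_h // out_h
    cols = np.arange(out_w) * in_w // out_w
    return img[rows[:, None], cols]

# A 2x2 checkerboard blown up 4x: each pixel becomes a hard-edged 4x4 block.
tiny = np.array([[0, 255], [255, 0]], dtype=np.uint8)
print(nearest_neighbor_scale(tiny, 8, 8))
```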
The good news is that nothing really does that anymore, because good scaling is handled by the GPU these days and is fast and cheap. It's called resampling, typically with a smoothing filter like bilinear or bicubic interpolation.

The way it works is to treat an image as a continuous field of color. The pixels of the content correspond to points in this field and specify the colors at those points. Think of a wooden board with a grid of closely spaced nails in it (the pixels), where each nail's height is its color, and you drape a blanket over it. To display the image, the field is divided into a grid matching the pixels of the output image, and each square of the grid is 'sampled': in the analogy, the average height of the blanket over that square is calculated. Since it's a continuous, smooth field, there are no hard pixel edges, i.e., there is literally no pixelation to be seen.
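For the curious, here's a toy version of that blanket-sampling idea using bilinear interpolation. Real GPUs use fancier filters (bicubic, Lanczos, etc.) and do this in hardware, but the principle is the same:

```python
import numpy as np

def bilinear_sample(img: np.ndarray, y: float, x: float) -> float:
    """Treat the image as a continuous field: blend the four nearest 'nails'."""
    h, w = img.shape
    y0, x0 = int(np.floor(y)), int(np.floor(x))
    y1, x1 = min(y0 + 1, h - 1), min(x0 + 1, w - 1)
    fy, fx = y - y0, x - x0
    top = img[y0, x0] * (1 - fx) + img[y0, x1] * fx
    bot = img[y1, x0] * (1 - fx) + img[y1, x1] * fx
    return top * (1 - fy) + bot * fy

def resample(img: np.ndarray, out_h: int, out_w: int) -> np.ndarray:
    """Sample the smooth field once per output pixel."""
    in_h, in_w = img.shape
    out = np.empty((out_h, out_w))
    for j in range(out_h):
        for i in range(out_w):
            # Map the center of this output pixel back into input coordinates
            # and sample the continuous field there.
            y = max((j + 0.5) * in_h / out_h - 0.5, 0.0)
            x = max((i + 0.5) * in_w / out_w - 0.5, 0.0)
            out[j, i] = bilinear_sample(img, y, x)
    return out

tiny = np.array([[0.0, 255.0], [255.0, 0.0]])
print(resample(tiny, 8, 8).round(1))  # smooth ramps, no hard-edged blocks
```

Run this and the nearest-neighbor snippet on the same 2x2 checkerboard and compare: one gives hard 4x4 blocks, the other a smooth gradient.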
Basically, this means that if what you're seeing looks low quality, it's because the original content is low quality. It has nothing to do with how the image is scaled up or down, with the native resolution of your screen, or with whether the native resolution matches the content resolution.
The other part of this equation is that graphic designers often design static content (text, icons, UI elements) to line up exactly with pixel boundaries. You could argue that such content won't look as good when scaled up or down, because it no longer lines up with those boundaries.
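You can see the effect with a few lines of Pillow (assuming it's installed); the image here is just a synthetic one-pixel line:

```python
import numpy as np
from PIL import Image  # assumes Pillow is installed

# A 1-pixel-wide dark line on a light background -- the kind of detail
# a UI designer deliberately aligns to a pixel boundary.
arr = np.full((8, 8), 255, dtype=np.uint8)
arr[:, 4] = 0
img = Image.fromarray(arr, mode="L")

# Scale by a non-integer factor (8 -> 12) with a smooth filter: the line's
# darkness is now shared between two output columns, so the crisp one-pixel
# edge turns into a softer two-pixel one.
scaled = img.resize((12, 12), Image.Resampling.BILINEAR)
print(np.asarray(scaled))
```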
But if you're talking about video, it's not like people shoot footage so that their content lines up on pixel boundaries; that would be impossible, and the idea is ridiculous. You're already looking at a pixelated representation of something that has no pixel boundaries, so even at its "native" resolution, it's still pixelated. Scaling such an image up or down isn't going to make its content align with pixel boundaries any better or worse.
The other issue with video is that you've got much bigger problems than any perceived "pixelation." If you pause a video and blow it up, you can see that it's full of compression artifacts. Frankly, any video looks like absolute crap if you examine it on a per-pixel basis; everything just moves around so fast that you can't possibly tell. And if your brain can't figure out that a 16x16 block of pixels is all one color when it should actually be a smooth gradient, why would you care about a tiny, tiny inaccuracy in the colors of a 2x2 block caused by content pixels not lining up exactly with output pixels?
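If you want to see this for yourself, something like the following works (Pillow assumed; the filename and crop coordinates are placeholders):

```python
from PIL import Image  # assumes Pillow is installed

# "paused_frame.png" is a hypothetical screenshot of a paused video frame.
frame = Image.open("paused_frame.png")

# Crop a small patch (coordinates are arbitrary) and blow it up 16x with no
# smoothing so the raw pixels show: block edges, banding where a gradient
# should be, mosquito noise around sharp edges.
patch = frame.crop((100, 100, 164, 164))
patch.resize((1024, 1024), Image.Resampling.NEAREST).show()
```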
Basically, your idea that video looks better when displayed at its native resolution, in a way a human being could actually notice, is not correct from a scientific, technical standpoint, for several different reasons.
In other words, buy whatever MacBook you want with whatever screen resolution you want and watch whatever content you want and it will look good. Or as good as the quality of the original content, anyway.