In the strict sense of the word, TRUE randomness means "continuing to positively maintain the ongoing assertion of equivalent probability". Of course, as was mentioned, this very strict definition of randomness is inherently deterministic. Furthermore, there is no real world example that I can think to cite because the real world is not like true randomness even though there are a few cases which closely approximate it. Most people refer to randomness loosely in an inexact fashion which doesn't exactly help the matter of what they are trying to debate. A pseudorandom number generator never produces an unpredictable result, by the way. A pseudorandom number generator only produces results that when viewed aggregately can be said to be random. For instance, over binary range, 0101 and 1010 are both truly random sequences, whereas 0011 and 1100 are only aggregately random. The truly random samples "continue to positively maintain the ongoing assertion of equivalent probability" whereas the aggregately random samples only succeed in achieving quantitative parity with "the ongoing assertion of equivalent probability" after n selections.
Most of that doesn't even make sense.
I see no way that 0101 and 1010 can be said to be random, if they are part of a simple alternating sequence. They would both then be completely predictable (i.e. non-random).
You also seem to be confused or mistaken about the difference between conditional probability and independent probability. A fairly shuffled deck of cards naturally exhibits the former, while a fair die roll or coin flip naturally exhibits the latter.
A deck of cards "has memory" or exhibits conditional probability because cards are shown and then removed from the deck. A shown card can never appear again. The unshown remainder of the deck then has 0 probability of producing the card that was just shown, and 1/(N-1) probability of the next card being a particular one of the unshown cards, where N is the number of cards before one was shown.
For example, suppose a 52-card deck is shuffled and one card is shown. It's the 2 of hearts. The probability of the 2 of hearts appearing as the next card is 0. The probability of any specific other card appearing next is 1/51. Eventually there are two cards remaining, and you know exactly which two because you already know every other card that was shown. What you don't know is the order, i.e. which of the two will appear next. Suppose the last two cards are the 3 of clubs and the 5 of spades. At this point the probability of the 3 of clubs being shown next is 1/2. The same probability applies to the 5 of spades. The probability of any other card appearing is 0. Once the next card is shown, you can then predict with 100% certainty what the last card is. If that's your definition of deterministic, you need to take a basic statistics class.
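Here's a minimal sketch (my own illustration, not from anyone's post) of the deck's "memory": the conditional probability of each card depends entirely on the history of cards already shown. The card encoding and function name are arbitrary.

```python
from fractions import Fraction

# Build a 52-card deck as two-character strings, e.g. "2H" = 2 of hearts.
RANKS = "23456789TJQKA"
SUITS = "CDHS"
DECK = [r + s for r in RANKS for s in SUITS]

def next_card_probability(card, shown):
    """Conditional probability that `card` appears next, given the
    history `shown` of cards already dealt face up."""
    if card in shown:
        return Fraction(0)               # a shown card can never appear again
    return Fraction(1, 52 - len(shown))  # each unshown card is equally likely

# Reproduce the numbers from the example: the 2 of hearts is shown first.
assert next_card_probability("2H", ["2H"]) == 0
assert next_card_probability("3C", ["2H"]) == Fraction(1, 51)

# With 50 cards shown, each of the two remaining cards sits at 1/2.
shown_fifty = DECK[:50]
assert next_card_probability(DECK[50], shown_fifty) == Fraction(1, 2)
assert next_card_probability(DECK[0], shown_fifty) == 0
```

Using exact fractions rather than floats makes the probabilities like 1/51 come out exactly, which keeps the arithmetic honest.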
A fair die or coin flip has no memory: each roll or flip is independent of what came before.
A random stream of bits is simply a stream where you can't predict with better than 50% probability what the next bit will be, even if you know the entire history of prior bits. Knowing the entire history tells you nothing about the bits not yet seen. This is independent probability. Compare that to the conditional probability exhibited by the shuffled deck of cards, where each card shown (i.e. the history of prior cards) DOES affect the probability of the next card to appear.
If the deck were infinite, or more precisely if cards were dealt from an infinite number of fairly shuffled fair decks, then every card is equiprobable at every point, even if you know the entire history of dealt cards. That's because there is an unending supply of every possible card, so knowing the history doesn't tell you anything about the cards not yet seen. You can't even do card-counting at levels better than chance.
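An infinite shoe of fair decks is equivalent to sampling with replacement, which the following sketch (my own illustration, arbitrary seed and names) demonstrates: the history of dealt cards leaves every card at exactly 1/52.

```python
import random

rng = random.Random(1)
RANKS = "23456789TJQKA"
SUITS = "CDHS"
DECK = [r + s for r in RANKS for s in SUITS]  # 52 cards, e.g. "AS"

def deal_from_infinite_shoe():
    # Sampling WITH replacement: every card stays equally likely (1/52)
    # no matter what has been dealt before, so the shoe has no memory.
    return rng.choice(DECK)

# Deal a long history first; a card counter would be tracking all of this.
history = [deal_from_infinite_shoe() for _ in range(10_000)]

# But the conditional probability of any card is unchanged: still about 1/52.
trials = 52_000
hits = sum(deal_from_infinite_shoe() == "AS" for _ in range(trials))
assert abs(hits / trials - 1 / 52) < 0.005
```

This is why card-counting works at a casino table (finite decks, dealing without replacement) but fails against a continuous-shuffle machine, which approximates the infinite shoe.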