Topic: The everything Xbox thread

Posts 11,681 to 11,700 of 11,955



I don't remotely trust Kotaku's (or Jason Schreier's) ability to report on something like this honestly. They have very clear positions of their own that they're trying to push (creating unions).
Maybe that happens to be a fairly accurate portrayal of what's going on at Naughty Dog. It's certainly possible. Or maybe they only talk to the staff who confirm their own beliefs, and ignore anyone who says they're fine with the amount of overtime and love working there.
There's simply no way of knowing without more information. But that information can't be just a series of individual anecdotes. You need representative data on the opinions of staff there if you're going to understand it. We don't have that, and "journalists" often don't even bother trying to get it.

It's dangerous to go alone! Stay at home.


The moment crunch is mentioned Jason Schreier pops up somewhere. So he's never going to report on crunch without bias. For example, he specifically mentioned 70% of non-lead designers, which is an oddly specific category, and probably the highest, to prove his point.

14 out of 20 non-lead designers, so not even out of all designers. I have no idea how fast the turnover rate is, but I don't think this is out of the ordinary. This is just my hunch though. Like @Dezzy said, so many people are only hired for one game, so it's no surprise they're no longer with the company. Some of these positions are very flexible, and people move between companies a lot.

How many of the total staff have left ND in the last 10 years? And what's the average for other big studios? Those are the numbers I'd like to know before drawing a conclusion. If this isn't sustainable, they will have to change. It seems like it apparently isn't that bad, because this has been the situation at ND for a while (U4 had crunch as well, IIRC), and the quality of their games hasn't suffered either, IMO. I'm personally not really worried at the moment.



Precisely what @Dezzy writes above.

“A more senior team would’ve shipped TLOU2 a year ago”

But nobody manages that, and having played the game, the incredible level of polish tells me this person is likely just vindictive. Which is fine; nobody should be loyal to a corporate entity. But it's not the truth.

Switch FC: SW-4802-1501-8621


It's sad when some people dismiss these claims and assume that those who actually work at a company have an agenda, when their words have been objectively reported by Kotaku and Eurogamer and stated by the workers themselves on Twitter. I hope this never happens to you, with people calling you a liar on top of it.




I'm not assuming the people who work at the companies have an agenda. Nor am I saying anyone is necessarily lying. What I'm saying is we don't have enough information to know if these accounts represent the opinions of the general staff at these companies or not.

It's entirely possible that 90% of the staff at Naughty Dog hate working there and feel like it's an incredibly exploitative company in how it operates, in which case these accounts DO represent the general sentiment.

BUT it's also possible that fewer than 10% of the staff have a massive problem with it, and outlets like Kotaku and writers like Schreier just deliberately go and find everyone in that 10% and report on all of those opinions, presenting them as if they represent the whole team (given that it's a huge company, 10% of the total staff is still a big enough number of anecdotes to turn into a story).

All I'm saying is we have no way of knowing which of these is closer to the truth. But the fact that Jason Schreier seems to have blocked about two-thirds of the entire games industry on social media makes me very suspicious. That's a classic way of biasing your own data pool, and not something a "journalist" should ever be doing.

It's dangerous to go alone! Stay at home.


They don't just talk about their personal experience but about the whole team: policies, working hours for everybody, receiving good-bye emails every week, and so on. And it's not just one worker but many, all of them describing general practice. Are they all lying about everybody?

Another article:



Why is the Xbox thread all about Naughty Dog? Much as I enjoy Uncharted for some unknown reason, I've never actually thought of the creators of the platformer where you run toward the screen as particularly great...

@Ryu_Niiyama No no, I enjoy the animation. It makes it easier to find your replies in the longer threads, and it's part of the NL experience!



So I have a question. I'm getting a tablet this weekend, and I was wondering: with xCloud, can I log into my Game Pass account and play a game on my tablet while someone else uses the Xbox One?

Retired Push Square Moderator and all-around retro gamer.

My Backlog

Nintendo Network ID: Tasuki311


@BlueOcean Just replying here to your post on PXB simply because the comments system doesn't let you post long posts and I can't trim it in any reasonable way there, so I'll post it here so I don't have to delete and format it (I seriously want to slap Ant for implementing that darned comment length limit. It's an annoyance in almost every thread. And I'm still convinced it was implemented explicitly because of me. )

@BlueOcean Yeah, I've seen that, though I think the kind of statements you get in interviews and pre-launch media coverage probably isn't worth that much; it's just a matter of finding the people who do think so and posting about it.

What's interesting about those other discussions, though, is that they're separated from that sort of media coverage and offer a more "on the ground" sense of the water cooler chat (or Zoom-based, socially distanced water cooler chat?) around the studio: what's being heard by people whose job it is to research and know, before they're handed the kit.

Doesn't mean it's bulletproof conversation. The one guy is in art (but needs to know the new graphics pipeline and IO, since he's apparently the effects guy currently tasked with implementing raytracing). From what he's said, he's on all PS projects now (two PS-only indies, and the PS port team of an AAA title), so there may be an echo chamber there and he may be hearing less from the MS guys. For what he's doing, he favors how the PS hardware should let him use it.

The other guy seems to be on the engine/code side (at a different company, or companies). He's a bit less certain which way things will go in terms of advantages, but he had a very good pork-based analogy (yes, a pork-based analogy about a sausage factory, which he got from another dev at a different company) for what the major difference is in how you use both systems and what makes the PS one easier to use. Believe it or not, it was a great analogy and I understood exactly what it was saying... makes total sense.

If that's really "all" it is, then the two kits might end up being almost identical hardware equivalents that just get there in different ways, with each having different strengths and weaknesses that may manifest differently unless a project is geared to just the one platform. It just requires more effort on the XS. The really over-simplified gist of the pork analogy was this: the XB has more CUs, but the bottleneck is a comparatively restricted IO for loading them all efficiently enough to take advantage of them; otherwise you leave some of that capacity unused and keep starving cores with nothing to do. The PS has fewer cores, but each one can process a bit faster. So if you're not efficient at loading up the cores on XB, your reduced effective capacity is going to run slower, and feeding the cores takes more hand-holding than PS's setup does. The analogy went further, in interesting ways, about "loading the ingredient bins and changing them out versus having to dump the whole mixture and refill it with the new one." Harder to simplify that, but you get the idea in a nebulous cloud.
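For what it's worth, the CU-count-versus-clock tradeoff in that analogy can be put into rough numbers. Here's a back-of-the-envelope sketch: the CU counts and clocks are the publicly stated specs, but the utilization percentages are purely hypothetical, chosen only to illustrate the "starving cores" idea, not measured from any real workload.

```python
# Back-of-the-envelope GPU throughput sketch for the "starving cores" point.
# CU counts and clocks are the publicly stated specs; the utilization
# figures below are hypothetical illustrations, not benchmarks.

def effective_tflops(cus, clock_ghz, utilization):
    """RDNA2-style math: 64 shader ALUs per CU, 2 FLOPs per ALU per clock (FMA)."""
    peak = cus * 64 * 2 * clock_ghz / 1000  # TFLOPS
    return peak * utilization

xsx_peak = effective_tflops(52, 1.825, 1.0)   # ~12.15 TFLOPS on paper
ps5_peak = effective_tflops(36, 2.23, 1.0)    # ~10.28 TFLOPS on paper

# If IO can't keep the wider GPU fed (hypothetical 75% utilization) while
# the narrower, faster one stays close to full (hypothetical 95%):
xsx_fed = effective_tflops(52, 1.825, 0.75)
ps5_fed = effective_tflops(36, 2.23, 0.95)

print(f"Paper peak: XSX {xsx_peak:.2f} vs PS5 {ps5_peak:.2f} TFLOPS")
print(f"Under-fed:  XSX {xsx_fed:.2f} vs PS5 {ps5_fed:.2f} TFLOPS")
```

Under those made-up utilization numbers the narrower-but-faster machine comes out ahead, which is exactly the scenario the analogy describes. Shift the utilization assumptions and the ranking flips, which is also the point: the winner depends on how well each GPU gets fed.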

It sounds like if the XB can efficiently load up the GPU, it has some advantages. But it also sounds like, due to the ease of use of the toolkit, it's just easier to maximize the PS hardware than the XB's. If so, that probably translates to 1st-party games: if MS can really use their hardware to the max, they can do more with it. 3rd-party games are a wash depending on the implementation. I think some will work better on PS5 because the ease of use lets them do things efficiently without having to work hard at it compared to XB, and some will work better on XSeX because if they do optimize, there's more power to tap. What I think those guys are seeing, though, is that with the way they'll be implementing things in the future, XSeX's additional power starts mattering less, because GPU scalability isn't the primary bottleneck on either.

The one thing they both more or less agree on, though, is that the early games, the cross-gen games, will probably see an advantage on the XSeX. It sounds like it's designed for the way games have been built up until now, and that design is advantageous for those games (and BC). It's the new techniques the engines will be switching to over the next few years, which work more in favor of how the 30xx cards and the PS5 operate, where they're seeing PS picking up some advantages and XSeX finding it harder to achieve the same results.

The other thing they (and a few others in the industry who chimed in) all agree on is that they really, really dislike DF, and they dislike how it's become a standard for even experienced enthusiasts to cite in console wars. It sounds like what DF does, in an attempt to simplify things for the masses, is simplify too far and end up providing info that's just wrong or misses how something is meant to be used. It's not that they lie; it's that they simplify in a way that takes shortcuts on specifics, which makes the result inaccurate, and then the internet runs with it. DF is great for benchmarks, and those are solid, but it sounds like their tech explanations aren't a great thing to rely on.

So it will be interesting. Even going by what those guys are seeing, the games coming out now and for the next year or two are probably going to be overall better on XSeX. If MS can capitalize on that time to reinforce the consumer perception that it's "better," that could swing the sales balance toward them; and if the sales balance swings toward them, studios will optimize for them more thoroughly in the future. As-is, the studios expect PS5 to win in sales, and if that's "most of the market" and easier/cheaper to develop for, they'll focus on that and pay less attention to the XB version, which could end up making PS5 the better performer by mid-generation. But if MS can bolster their toolkit to automate some of that "hand-holding" so devs can tap its strengths as easily as PS5's interface allows, that could negate the gap as well. The only thing is, MS would have to automate in software some of what Sony is automating in hardware.

The only other thing the one guy was more uncertain of was the GPU clocks on XS. He was briefed on the Navi architecture (it sounds more like that was for the PC side of things, maybe; obviously, with NDAs and such, we're left guessing some of what they're really working with) and pointed out that most of the shaders are meant to run at the clocks the PS5 runs at in its boosted mode (he's also the one who kept asserting that it's meant to always run in that mode, though nobody listened to him), while the XSeX's lower clocks are technically below what the new shaders are meant to run at. He's in the dark as to exactly how that affects anything, though; it was just a technical observation he noted.

As for cooling, that also explains the absurdly overbuilt cooling on the PS and how the XSeX can remain cool with its design. The hardware just runs cooler at its native clocks, while the PS5 is effectively forcing an extreme overclock full-time, like an extreme PC (it sounds like the "variable clocks" are really about power conservation at idle and thermal management, more than a "boost" like the DF explanation made it sound).

BUT this is all from a time before most staff even have a dev kit, so it's all from the technical documents and seeing some proof in the output of the teams that do have one, not so much hands-on. We'll only get that at actual launch.

But even with their analysis, I think it's safe to bet we'll see XSeX pull ahead with the launch games and most games over at least the first year. It's in a year or even two, when the engines aren't built around the older hardware designs, that we'll see the effect if the performance pendulum swings to PS.

Sounds like it's a lot like the Wii U, when game design was switching from single-threaded, CPU-bound architecture to GPGPU. Nothing in game design was built for the latter, so ports all ended up janky even though the hardware was technically better. I think the PS5 (and the newer video cards) sound like they'll go that way. XSeX has a strong advantage for the older designs, but as engines move deeper into the new design paradigms, PS5 has a few efficiency benefits that kind of "automatically" work at the hardware level, while XSeX relies on more explicit software implementations to accomplish similar things.

The result is, it sounds like there's no clear "winner" by the end for third party. For first party, there's more rendering power on XSeX, but it sounds like the IO probably allows more useful headroom for other things on the PS5. For third party, it's sounding like a mixed bag where some games will run better on each, depending on the team, their design, and their experience.

And every now and then you get a Yakuza that runs like trash on anything.



@Tasuki It should be possible as long as someone else is using your X1 with their account and it's your "Home Xbox" (primary). You can use two X1s with the same games as long as someone else is logged into your "Home/Primary" console and you're logged in and online on another system. xCloud should be no different. PS4 and Switch also work that way. (I'm hoping beyond hope Sony doesn't break that with PS5; I don't fear MS will break it with Series.)



Jacksonbrw wrote:

Hello guys I was lucky enough to find this website having the PlayStation 5 and Xbox Series X still in stock. But just in very small quantities, I quickly reserved up my copy for the Xbox series X by pre-ordering from there. You can do that here ;

They even have some games for preorder too. You can get all that ina good bundle. Plus they offer sweet discounts!

Want in on a family group for cheap NSO membership? Email [email protected]

What better way to celebrate than firing something out of the pipe?

Nothing is true. Everything is permitted.

My Nintendo: gcunit | Nintendo Network ID: gcunit


Should we just give him our bank account numbers and Apple passwords now, or wait until buying a Series X through the link?



Apparently the XB1 version of Minecraft always runs like trash, so Minecraft would probably be a good test of how much better it runs on Series X. I only have it on 360; I got it essentially for free in one of those "buy certain games and get Microsoft Points back" promotions in 2013 (I was interested in other games in the promotion), but it didn't click with me, so I didn't upgrade to the XB1 version. Kind of like how I don't really use the creation features in the Mario Maker games much.


Switch Friend Code: SW-2595-6790-2897 | 3DS Friend Code: 3926-6300-7087 | Nintendo Network ID: GrumbleVolcano


It's choppy upon loading the game, but I can't say the game runs like trash once you're actually in a world. At least, I don't remember it being bad the last time I played it.

"Sometimes, I just don't understand human behavior" - C-3P0


Can anyone tell me how a 15-year-old Java game runs like trash on anything??



Minecraft is a fairly CPU-intensive game, so I expect it'll run beautifully on the new consoles.

What I'm more interested in is how that gorgeous raytracing-enabled demo would run on the Series S.

Switch Lite: Ni no Kuni: Wrath of the White Witch
PC: DOOM 2016 (post-game; cleaning up collectibles)



Yeah Minecraft is doing a hell of a lot more on the CPU side than most games are. That's not so surprising at all. It's only the GPU side that's incredibly basic.

BlueOcean wrote:

and it's not just one worker but many and all of them talk about general practice, are they all lying about everybody?

That account WAS just from one person, though. He doesn't necessarily need to be lying in order to be misleading. People have different perspectives, temperaments, and tolerances. He might simply be someone who heavily dislikes doing ANY overtime (such people clearly exist, as we've seen from the reaction to the incredibly mild amount of overtime they're doing on Cyberpunk), and he simply projects his own sensitivity onto other people (who wouldn't agree if they were actually asked). We have no way of knowing without more information on the opinions within the company. We'd need an anonymous poll of the whole staff, or something like that; NOT cherry-picked anecdotes from a small number of people.
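To illustrate the cherry-picking point concretely, here's a toy simulation. Every number in it (a 300-person studio, a 10% unhappy minority, the sample sizes) is hypothetical, purely for illustration of how sourcing only from one side distorts the picture compared to a representative poll.

```python
# Toy simulation: if only 10% of a hypothetical 300-person studio is
# unhappy, sources found through the unhappy minority's network paint a
# very different picture than a random, anonymous-poll-style sample.
import random

random.seed(42)
staff = [True] * 30 + [False] * 270  # True = unhappy (10% of 300)

# "Cherry-picked" sourcing: every interviewee comes from the unhappy pool.
cherry_picked = random.sample([p for p in staff if p], 10)

# Representative sourcing: an anonymous poll across the whole staff.
poll = random.sample(staff, 100)

print(f"Cherry-picked sample: {sum(cherry_picked) / len(cherry_picked):.0%} unhappy")
print(f"Random poll estimate: {sum(poll) / len(poll):.0%} unhappy")
# The first always reads as 100% unhappy; the second should land
# somewhere near the true 10%.
```

The cherry-picked sample is internally consistent (ten real accounts, nobody lying) and still wildly unrepresentative, which is exactly the gap between "these reports are true" and "these reports describe the whole studio."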

Edited on by Dezzy

It's dangerous to go alone! Stay at home.


@gcunit @Ryu_Niiyama Speaking of not being part of the demographics... why does Every. Single. Game. Ever. have zombies now? Especially Sony games? Almost every game they publish is zombies. Death Stranding? Kinda zombies. Last of Us? Zombies. Days Gone? Zombies. Ghost of Tsushima? No zombies... WHAT, NO ZOMBIES? Quick! Patch them into a new multiplayer mode!!! NOW it's got 'em! Uncharted? Occasional zombies. Why is every game zombified?

Anyway, a much briefer follow-up to the demographic conversation: no, I really don't think it's age-based at all. Again, I think games were more age-targeted, demographically, in the past than now. In the '80s and '90s it was mostly children (we can say "families," but it was children). In the '90s and '00s it was angsty teens (and children/"families" for Nintendo). Through the 2010s to 2020 it's really changed, and I think there's now a large age range of titles, where nostalgic '80s and '90s child gamers, now adults, are actually one of the core demographics being targeted, in addition to other demos.

What I think mostly changed (and where I say "culturing out," though that commentary extends far beyond video games into the culture in general: anyone who grew up even somewhat outside the retroactively apparent "normal" is not going to be a part of the society at all...). For gaming specifically, I think the problem is not age-based; now that I think of how to phrase it, gaming went "mainstream." Where that conflicts is that most of us existing gamers are the opposite of "mainstream." We're nerds. Geeks. Dweebs. We were attracted to gaming, explicitly or unknowingly, due to its geeky nature: geeky behaviors, geeky focuses, and geeky approaches to everything. The content, the way it was promoted, all of it appealed to a geeky sensibility, which is why we were in it: because the rest of the world was clearly not for us, but this was.

But in the past decade gaming has gone truly mainstream. That means the target customer is all those other people who liked all the other things in the world we never understood, which is why we played games. So it's taken that thing we understood, a thing fit for nerds, and un-nerdified it: removed the nerdy sensibilities and added sensibilities for people who install "home speakers," watch network TV sportscasts regularly, and know all the pop culture trends. Those other people who aren't us. Whom we don't understand, never did understand even when young, and never will understand... because we're nerds. Video game nerds. But video games aren't aimed at the "outsider" nerd demographic anymore. MMOs are.

The target demo is "normies".... We're used to games being based on pen and paper adaptations of D&D with spiral bound rule books and everything in one box. Not "seasons", "updates with exciting new social features to schedule with digital friends", and "Now with 30% more Michael Bay."



You'll have to wait for Baldur's Gate 3 to come to next-gen consoles, @NEStalgia .... That'd probably suit you well!

Gamertag: BruceCM


@NEStalgia I never get the death obsession anyway. Zombies, ghosts, vampires...why? Being alive is dangerous and exciting enough.

I still think marketing follows age and sex demographics. (They also create the fads that tie to said demos, but still, once it becomes part of culture it becomes a sales data point.) But it was more generally targeted, the way you described, in the past. Targeting teens in the late '90s usually meant males, and the commercials showed it. That's part of why Sony never hooked me on the PlayStation.

I do think, with all the data collected, that they're breaking down those demos to show reality vs. perceived social norms (as they aren't really norms so much as indoctrination), but it doesn't quite show in marketing yet, IMO. Gaming stats show more females are gaming than ever, but you don't see marketing showing girls and women playing everything except in family-based commercials, because apparently everyone has one boy and one girl. Not taking a potshot, but that makes me wonder what Chinese gaming commercials look like, because of the impact of the old one-child (and society hopes it's a male) policy.

It doesn't follow popular culture the way I expected, though. When the superhero boom hit its stride, I expected way more games than we've gotten so far. I'm not a Batman fan, so that puts me off Arkham. But we have all these TV shows and movies... where is the Wonder Woman game, or Black Lightning, or X-Men (comic-based, aside from MUA3, which was fun but not enough)? We got some licensed stuff in the early Fox movie era, and some Iron Man and Thor and that one Cap game, but if it isn't LEGO, it seems to not be worth making. I love these characters in fighting games, but why not just make an adventure game? Granted, adventure games feel like they've gone out of style. Either people are stapling RPG elements onto them or they just don't show up multiplatform (I greatly enjoyed GoT, and once my SSD shows up I'll put a dent in Spider-Man).

I hate it, but I think you're on to something about removing the nerd aspects. Not to be mean, but I blame the military shooter and the sports sim. There's nothing wrong with them existing, but as they drowned out everything else and focused more on being realistic, I feel like the gamey, fantastical aspects were dropped from gaming.

I remember all the talk about the gunplay being improved in FO4 while "streamlining" the role-play. I mean, Bethesda Fallout isn't that immersive, but don't take more out. (I'm aware this is a Todd Howard thing.) Granted, I don't need games to be as over-complicated as the old D&D-based games, but do we have to run to either extreme?

I don't know what a "normie" is. I figure a person of average intelligence, height, and weight suffices, because "normie" varies by perspective. In America I'm a minority, but if I packed up and moved to Nigeria, I would find people who look mostly like me (or rather like my mom; Cherokee on my dad's side left me "high yellow," as the old insult used to go), which would make me a normie there. Japanese devs mostly make their games for their own cultural norms, but they're consumed all over the world. So I wouldn't lean on so-called normie culture. Also, I'm done using that word; I haven't heard it since a That's So Raven episode, and I find it hilarious that it's apparently a popular term now.

Edit: does anybody know if the Series X will take an external SSD alongside their expansion card? I want a drive for my old games, because I know the in-box 1 TB will be gone in like 7 games for the new stuff.

Edited on by Ryu_Niiyama

Taiko is good for the soul, Hoisa!
Japanese NNID:RyuNiiyamajp
Team Cupcake! 11/15/14
Team Spree! 4/17/19
I'm a Dream Fighter. Perfume is Love, Perfume is Life.

3DS Friend Code: 3737-9849-8413 | Nintendo Network ID: RyuNiiyama


Please login or sign up to reply to this topic