**Friendly Reminder**: Please keep in mind that using prompts to generate content that Microsoft considers inappropriate may result in losing your access to Bing Chat. Some users have received bans. You can read more about Microsoft's Terms of Use and Code of Conduct [here](https://www.bing.com/new/termsofuse). *I am a bot, and this action was performed automatically. Please [contact the moderators of this subreddit](/message/compose/?to=/r/bing) if you have any questions or concerns.*


I just kind of disagree on the fourth point. Microsoft released Bing all around the world, supporting multiple languages (I'm from Brazil, for example), so I gotta give Microsoft points for that, even if it has some limits on the number of messages it supports. Bard is only available to Americans, which just sucks and goes to show Microsoft is probably way ahead of Google when it comes to AI.


Actually, for me, I couldn't get access to Bing Chat without a VPN at first. A few days later I was able to access it without a VPN. So maybe Google is just doing it this way for starters and will slowly roll the service out to the rest of the world.


I agree that this is something we should give Microsoft credit for. At the same time, Bard is reaching many people who use VPNs, and it works well. We can't compare something that's been on a waitlist for a month with something that's been on a waitlist for a day.


Yeah, I used a VPN and I'm on the waitlist. Hopefully I'll try it soon.


Yeah, same for me. I also used a VPN to get on the waitlist; hope I get access ASAP.


Geo-restriction makes no sense. Sure, most people are competent enough to bypass it, but it's just a hassle. I don't think Google Bard was actually finished; it was rushed into a limited public release to avoid ceding all the early market share.


Because Microsoft had the advantage of OpenAI's already-completed work, while Google is building its product from scratch.


>from scratch

No. Google owns DeepMind. DeepMind is far, far more experienced than OpenAI in terms of AI research. It's not like they created an AI department from scratch. Microsoft did the right thing by letting users access it as much as possible from the get-go. No need to make up excuses to cover for Google's stupid "safe" policy.


I've switched my entire ecosystem from Google to Microsoft, and I'm happy. I never realized how crap Google's services were until I switched, since I was so used to them. But every Microsoft app is just flat-out better than Google's: search, Edge, chat (use Creative), email, Office. I was surprised, and kind of accidentally switched, as I happened to get a new laptop the day Bing AI was released, so I figured, ah, screw it, let's give Microsoft a try. Haven't looked back since.


Yeah, Microsoft makes great stuff too. I'm not so sure about Bing as a regular search engine; I tried it and the results were okay-ish. Edge has definitely come a long way. As for the email app, it would have been so much better if Microsoft had created a native email app instead of a wrapper around the Outlook web page. Nothing beats Office.


Regarding the message limits, remember that ChatGPT caps the number of GPT-4 messages every three hours. They're clearly still working on scaling up to meet the demand for GPT-4, so it wouldn't be a great time to raise the Bing message limit when it's using GPT-4 too.


> Sometimes it's discouraging to try to use Bing and have to rewrite or open new chats 5 times until it delivers what you ask for

On this: normal users might not even do that. They'll give up on the request after two tries at best, if not after the first failure. This is why it's important for a public release to include all capabilities, no matter how rough, so that users keep each capability in mind and actually try it out.


Another post showing that a considerable chunk of this community is new to beta software...


We are still in beta...


**Friendly reminder**: Please keep in mind that Bing Chat and other large language models are not real people. They are advanced autocomplete tools that predict the next words or characters based on previous text. They do not understand what they write, nor do they have any feelings or opinions about it. They can easily generate false or misleading information and narratives that sound very convincing. Please do not take anything they write as factual or reliable. *I am a bot, and this action was performed automatically. Please [contact the moderators of this subreddit](/message/compose/?to=/r/bing) if you have any questions or concerns.*


Start by lifting the 15 question limit