VB.NET validation

Visual Basic 6.0 handles validation with two pieces: the 'CausesValidation' property and the 'Validate' event. You need Visual Basic 6.0; begin a new project by choosing the 'Standard EXE' template after opening Visual Basic (VB), which opens a form window named 'Form1'. ASP.NET MVC 3 provides several ways to build custom validation, and two common ones are the ValidationAttribute base class and the IValidatableObject interface; a simple sample MVC application is enough to explore both. The EU's VIES VAT number validation service lets you verify the validity of a VAT number issued by any Member State by selecting that Member State from a drop-down menu and entering the number to be validated. Verification in software testing is the process of checking documents, design, code, and programs to confirm that the software has been built according to the requirements; its main goal is to ensure the quality of the application, its design, and its architecture. In ASP.NET MVC, the Html.ValidationMessage() extension method is a loosely typed helper used to display a validation message. A mod-11 check digit works like this: 0 + 18 + 30 + 20 + 9 + 4 = 81; 81 / 11 = 7 remainder 4; 11 - 4 = 7, so 7 is the check digit. If the remainder from the division is 0 or 1, the subtraction yields 11 or 10, which will not fit in a single digit; by convention, X is used when the check digit would be 10 and 0 when it would be 11. Finally, before submitting data to the server it is important to ensure all required form controls are filled out in the correct format; this is called client-side form validation, and it helps ensure that submitted data matches the requirements set by the various form controls.
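As a quick illustration of the mod-11 check-digit rule sketched above (it matches the ISBN-10 convention), here is a minimal VB.NET sketch. The function name Isbn10CheckDigit and the standard 10-down-to-2 weighting of the first nine digits are assumptions added for illustration; only the divide-by-11, subtract-from-11, X-for-10 and 0-for-11 steps come from the passage itself.

```
Module IsbnCheck
    ' Check-digit rule as described above; assumes the usual ISBN-10
    ' weighting of 10 down to 2 for the first nine digits.
    Function Isbn10CheckDigit(firstNine As String) As String
        Dim sum As Integer = 0
        For i As Integer = 0 To 8
            sum += (10 - i) * Integer.Parse(firstNine(i).ToString())
        Next
        Dim check As Integer = 11 - (sum Mod 11)
        If check = 11 Then Return "0"   ' remainder was 0
        If check = 10 Then Return "X"   ' remainder was 1
        Return check.ToString()
    End Function

    Sub Main()
        ' A commonly cited example: the first nine digits of 0-306-40615-2
        ' should yield the check digit 2.
        Console.WriteLine(Isbn10CheckDigit("030640615"))
    End Sub
End Module
```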

Path to take??

2020.09.28 18:53 GiantGoose20 Path to take??

Hello all,
I have a Bachelor's degree in Earth Sciences and have a strong interest in, and non-work-related experience with, computers. I've built computers, run game servers, designed some minor applications in VB.NET once upon a time, etc. I'm currently going through a career change away from Earth Sciences (I'm 27) and would like to expand my horizons beyond the very specific degree and work experience I currently have.
I'm trying to find a way to validate my skills and experience, and my hope is A+ would do that. I am also interested in eventually sitting for the more specialized exams.
I am also interested in a Master's. I was looking at an MBA program with a specialization in "Information Systems Management". The program has core MBA components and a few specialized courses, such as an SQL course, Software Systems Design, etc.
My questions are...would an A+ certification (and possibly others) be something I should consider to bolster my resume to attempt to get into the IT field? Also, would an MBA with some IT specialization be worth much to an employer (I know all employers are different, but seeking opinions) or would it be better to focus on a Masters in IT and forget the MBA?
I'm trying to avoid being pigeonholed to a specific industry as I currently feel I am now.
submitted by GiantGoose20 to CompTIA [link] [comments]


2020.09.24 16:20 WnT_Hilfry Can someone help me? My game won't launch at all

Here are the log file and my system specs. I launch the game, and even when launching as admin it crashes almost instantly:
GPU AMD Radeon R7 260X overclocked
CPU Intel i7 4770K 3.5GHz
16GB RAM

Log: Log file open, 24/09/2020 16:15:24
Log: GPsyonixBuildID 200917.51558.292065
Log: Command line: -AUTH_LOGIN=unused -AUTH_PASSWORD=6095ed29f2a04246bf02857ab83070b4 -AUTH_TYPE=exchangecode -epicapp=Sugar -epicenv=Prod -EpicPortal -epicusername="Hilfry_" -epicuserid=0a1a42dccecd4e348f3cf34dbf22013f -epiclocale=en
Init: WinSock: version 1.1 (2.2), MaxSocks=32767, MaxUdp=65467
Log: ... running in INSTALLED mode
Init: Language extension: INT
Init: Language extension: INT
DevConfig: GConfig::LoadFile associated file: ..\..\TAGame\Config\TAUI.ini
Init: Version: 200917.51558.292065
Init: Compiled (64-bit): Sep 17 2020 14:53:46
Init: Command line: -AUTH_LOGIN=unused -AUTH_PASSWORD=6095ed29f2a04246bf02857ab83070b4 -AUTH_TYPE=exchangecode -epicapp=Sugar -epicenv=Prod -EpicPortal -epicusername="Hilfry_" -epicuserid=0a1a42dccecd4e348f3cf34dbf22013f -epiclocale=en
Init: Base directory: G:\Epic Games\rocketleague\Binaries\Win64\
[0000.58] Log: Purging 3-day cache '..\..\TAGame\Logs'...
[0000.58] Init: Computer: DESKTOP-2F4FMR1
[0000.58] Init: User: Branchy
[0000.58] Init: CPU Page size=4096, Processors=8
[0000.58] Init: High frequency timer resolution =10.000000 MHz
[0000.58] Init: Memory total: Physical=15.9GB (16GB approx) Pagefile=18.4GB Virtual=131072.0GB
[0000.68] Init: Presizing for 135000 objects not considered by GC, pre-allocating 0 bytes.
[0000.68] Init: Object subsystem initialized
[0000.82] Log: Using feature set PrimeUpdate32
[0000.87] Log: Found D3D11 adapter 0: AMD Radeon R7 200 Series
[0000.87] Log: Adapter has 1000MB of dedicated video memory, 0MB of dedicated system memory, and 8160MB of shared system memory
[0000.91] Log: Found D3D11 adapter 1: Microsoft Basic Render Driver
[0000.91] Log: Adapter has 0MB of dedicated video memory, 0MB of dedicated system memory, and 8160MB of shared system memory
[0000.91] Log: Shader platform (RHI): PC-D3D-SM5
[0006.34] Log: ProductDatabase_TA::UpdateAvailableProducts 0.15 sec total.
[0006.76] Log: 126489 objects as part of root set at end of initial load.
[0006.76] Log: 0 out of 0 bytes used by permanent object pool.
[0006.76] Log: Initializing Engine...
[0006.76] Log: BuildID: -1087928065 from GPsyonixBuildID
[0006.88] SystemSettings: Loading PC Settings
[0007.00] Log: Running hardware survey...
[0007.00] Log: OS: Microsoft Windows 10 Pro (19041)
[0007.00] Log: Wwise(R) SDK Version 2019.1.1 Build 6977. Copyright (c) 2006-2012 Audiokinetic Inc. / All Rights Reserved.
[0007.15] Log: WinSAT: 5.9 [8.9 CPU, 8.2 2D, 9.9 3D, 8.9 Mem, 5.9 Disk]
[0007.15] Log: Processor: Intel(R) Core(TM) i7-4770K CPU @ 3.50GHz (Intel64 Family 6 Model 60 Stepping 3) 4 Cores, 8 Threads
[0007.15] Log: Memory: 4.00GB
[0007.15] Log: Memory: 4.00GB
[0007.15] Log: Memory: 4.00GB
[0007.15] Log: Memory: 4.00GB
[0008.82] Log: VideoController: AMD Radeon R7 200 Series (27.20.12029.1000)
[0008.82] Log: Network Adapter: Intel(R) Ethernet Connection I217-V
[0008.82] Log: Disk C: 337.16GB free of 465.19GB
[0008.82] Log: Disk G: 386.18GB free of 465.75GB
[0008.82] Log: Sound Device: USB Audio Device
[0008.82] Log: Sound Device: High Definition Audio Device
[0008.82] Log: Sound Device: AMD High Definition Audio Device
[0008.82] Log: Sound Device: VB-Audio Virtual Cable
[0008.82] Log: Hardware survey complete in 0.53 seconds.
[0008.82] DevOnline: Created named interface (RecentPlayersList) of type (Engine.OnlineRecentPlayersList)
[0008.82] ScriptLog: PsyNet using environment DBE_Production Prod
[0008.83] PsyNet: PsyNetConnection_X_0 disabled OSCS_ServiceUnavailable
[0008.83] PsyNetStaticData: HandleCacheExpired
[0008.83] PsyNetStaticData: HandleGetURL URL=https://config.psynet.gg/v2/Config/BattleCars/-1087928065/Prod/Epic/INT/
[0008.86] DevOnline: WebRequest_X_0 SEND: https://config.psynet.gg/v2/Config/BattleCars/-1087928065/Prod/Epic/INT/
[0009.07] DevOnline: WebRequest_X_0 RECV: 304
[0009.07] PsyNetStaticData: HandleDataChanged
[0009.08] Log: FJsonStructReader - attempting to overwrite object property 'Class' for struct PresetMutators_X.
[0009.08] Log: FJsonStructReader - attempting to overwrite object property 'Class' for struct PresetMutators_X.
[0009.08] Log: FJsonStructReader - attempting to overwrite object property 'Class' for struct PresetMutators_X.
[0009.08] Log: FJsonStructReader - attempting to overwrite object property 'Class' for struct PresetMutators_X.
[0009.08] Log: FJsonStructReader - attempting to overwrite object property 'Class' for struct PresetMutators_X.
[0009.08] Log: FJsonStructReader - attempting to overwrite object property 'Class' for struct PresetMutators_X.
[0009.24] Log: ProductDatabase_TA::UpdateAvailableProducts 0.16 sec total.
[0009.24] PsyNet: PsyNetConnection_X_0 enabled
[0009.26] PsyNetStaticData: UpdateCacheTimerEnabled CacheTimer.bEnabled=True
[0009.29] Auth: OnlinePlayerAuthentication_TA_0 None Unknown00 SetAuthLoginError [Error Type=OSCS_NotConnected Code=-1 Message=]
[0009.29] Auth: OnlinePlayerAuthentication_TA_0 None Unknown00 Logout
[0009.29] PsyNet: PsyNetConnection_X_1 disabled OSCS_NotConnected
[0009.29] Auth: OnlinePlayerAuthentication_TA_0 LoggedOut Unknown00 HandleConnectionChanged
[0009.29] Auth: OnlinePlayerAuthentication_TA_0 LoggedOut Unknown00 UpdateLoginState
[0009.29] Auth: OnlinePlayerAuthentication_TA_0 LoggedOut Unknown00 SetAuthLoginError [Error Type=OSCS_NotConnected Code=-1 Message=]
[0009.29] Auth: OnlinePlayerAuthentication_TA_0 LoggedOut Unknown00 Logout
[0009.29] Auth: OnlinePlayerAuthentication_TA_0 LoggedOut Unknown00 UpdateLoginState AuthLoginError=[Error Type=OSCS_NotConnected Code=-1 Message=]
[0009.29] SaveGame: Load Player.ControllerId=0 SaveFileName=..\..\TAGame\SaveDataEpic\DBE_Production\0a1a42dccecd4e348f3cf34dbf22013f.save
[0009.30] Error: Error, FSaveDataImportTask: '..\..\TAGame\SaveDataEpic\DBE_Production\0a1a42dccecd4e348f3cf34dbf22013f.save' no files found
[0009.31] DevNet: Browse: MENU_Main_p
[0009.31] Log: LoadMap: MENU_Main_p
[0009.34] Log: Fully load package: ..\..\TAGame\CookedPCConsole\gameinfo_gfxmenu_SF.upk
[0009.36] Log: Fully load package: ..\..\TAGame\CookedPCConsole\gfxsounds_mainmenu_SF.upk
[0009.37] Log: Fully load package: ..\..\TAGame\CookedPCConsole\gfx_startmenu_SF.upk
[0009.43] Log: Fully load package: ..\..\TAGame\CookedPCConsole\gfx_mainmenu_SF.upk
[0009.48] Log: Game class is 'GameInfo_GFxMenu_TA'
[0009.52] Log: *** WARNING - PATHS MAY NOT BE VALID ***
[0009.52] Log: Bringing World MENU_Main_p.TheWorld up for play (0) at 2020.09.24-16.15.33
[0009.52] Log: Bringing up level for play took: 0.031991
[0009.52] Legacy: Unable to update on shell set Error:OSCS_NotConnected
[0009.86] DevGFxUIWarning: Scale9Grid for resource=289 has negative width -2.050000
[0010.01] Log: ########### Finished loading level: 0.695174 seconds
[0010.01] Log: Flushing async loaders.
[0010.34] Log: Flushed async loaders.
[0010.34] Log: Initializing Engine Completed
[0010.34] Log: >>>>>>>>>>>>>> Initial startup: 10.34s <<<<<<<<<<<<<<<
[0010.37] DevOnline: WebRequest_X_1 SEND: https://rl-cdn.psyonix.com/SpecialEvent/Images/rl_5thanniversary_texttreatment-vertical.png
[0010.37] DevOnline: WebRequest_X_2 SEND: https://rl-cdn.psyonix.com/SpecialEvent/Images/rl_5thAnniversary_event-mode-bg.jpg
[0010.37] DevOnline: WebRequest_X_3 SEND: https://rl-cdn.psyonix.com/SpecialEvent/Images/AnniversaryCurrency.png
[0010.37] DevOnline: WebRequest_X_4 SEND: https://rl-cdn.psyonix.com/SpecialEvent/Images/AnniversaryCurrency_Large.png
[0010.37] DevOnline: WebRequest_X_5 SEND: https://rl-cdn.psyonix.com/XPIcons/XP_1.png
[0010.37] DevOnline: WebRequest_X_6 SEND: https://rl-cdn.psyonix.com/XPIcons/XP_2.png
[0010.37] DevOnline: WebRequest_X_7 SEND: https://rl-cdn.psyonix.com/XPIcons/XP_3.png
[0010.37] DevOnline: WebRequest_X_8 SEND: https://rl-cdn.psyonix.com/XPIcons/XP_4.png
[0010.37] DevOnline: WebRequest_X_9 SEND: https://rl-cdn.psyonix.com/Playlists/Images/rl_event_mode_bg_beachball.jpg
[0010.37] DevOnline: WebRequest_X_10 SEND: https://rl-cdn.psyonix.com/Playlists/Images/Limited_Events_Panel_Beachball.png
[0010.37] DevOnline: WebRequest_X_11
submitted by WnT_Hilfry to RocketLeague [link] [comments]


2020.09.15 22:32 longbottomjr Student Programming Assistant worth anything?

I have an opportunity to work as a Student Programming Assistant where I would:
  1. Write and execute SQL and PL/SQL code and scripts against Oracle databases.
  2. Perform various bulk data loads and modifications using the Advance DataLoader and Loader utilities.
  3. Assist in the identification and correction of data integrity issues within the Advance database.
  4. Assist in the occasional creation, design, testing, documentation, and troubleshooting of Advance reports using Crystal Reports and Oracle.
  5. Assist in the merging and coding of external databases with the Advance database.
  6. Assist in the testing and validation of new and existing Advance utilities and tools after each Advance application or database upgrade.
  7. Assist in the creation and maintenance of technical documentation describing Data and Technology Services data procedures and reports.
  8. Assist in troubleshooting minor hardware/software issues.
  9. Assist in intranet/internet website maintenance, content management, and web programming utilizing HTML/PHP/.NET 4.0 (VB and C#).
I would be taking a pay cut from my current NON-CS part-time job. Would taking this job be worth it down the road, resume- and experience-wise?
submitted by longbottomjr to cscareerquestions [link] [comments]


2020.08.21 15:06 nivek_123k Virtualbox nested paging limitation?

Host: Win10 x64, VirtualBox 6.1.10. Extension pack is loaded. Proc: Intel(R) Core(TM) i7-7700HQ CPU @ 2.80GHz
Guest1: CentOS 8: Running KVM/QEMU with kvm_intel nested=1, vbox guest additions=yes, and nested-hw-virt=true.
             
libvirt-host-validate:
[[email protected] ~]# virt-host-validate
QEMU: Checking for hardware virtualization : PASS
QEMU: Checking if device /dev/kvm exists : PASS
QEMU: Checking if device /dev/kvm is accessible : PASS
QEMU: Checking if device /dev/vhost-net exists : PASS
QEMU: Checking if device /dev/net/tun exists : PASS
QEMU: Checking for cgroup 'cpu' controller support : PASS
QEMU: Checking for cgroup 'cpuacct' controller support : PASS
QEMU: Checking for cgroup 'cpuset' controller support : PASS
QEMU: Checking for cgroup 'memory' controller support : PASS
QEMU: Checking for cgroup 'devices' controller support : PASS
QEMU: Checking for cgroup 'blkio' controller support : PASS
QEMU: Checking for device assignment IOMMU support : WARN (No ACPI DMAR table found, IOMMU either disabled in BIOS or not supported by this hardware platform)
Not sure IOMMU is necessary, so ignored.
I've loaded the necessary packages on CentOS8 guest to host KVM/QEMU. Also installed the guest additions (which I'm not sure are relevant for nested virt).
[[email protected] ~]# modprobe -r kvm_intel
[[email protected] ~]# modprobe kvm_intel nested=1
[[email protected] ~]# cat /sys/module/kvm_intel/parameters/nested
1
After that, dmesg reports:
[ 1637.692524] Processors without extended page tables or support for shadow VMCS are not recommended by Red Hat for nested virtualization. 
Guest 1-a Alpine Linux 3.8 /w virt kernel: (First guest on the CentOS guest: nested/nested)
internal error: process exited while connecting to monitor:
2020-08-21T13:05:40.499629Z qemu-kvm: warning: host doesn't support requested feature: MSR(48CH).vmx-invvpid-single-addr [bit 40]
2020-08-21T13:05:40.499643Z qemu-kvm: warning: host doesn't support requested feature: MSR(48CH).vmx-invept-single-context [bit 41]
2020-08-21T13:05:40.499656Z qemu-kvm: warning: host doesn't support requested feature: MSR(48CH).vmx-invvpid-all-context [bit 42]
2020-08-21T13:05:40.499670Z qemu-kvm: warning: host doesn't support requested feature: MSR(48CH).vmx-invept-single-context-noglobals [bit 43]
2020-08-21T13:05:40.499687Z qemu-kvm: warning: host doesn't support requested feature: MSR(491H).vmx-eptp-switching [bit 0]
2020-08-21T13:05:40.501076Z qemu-kvm: error: failed to set MSR 0x48b to 0x1584d00000000
qemu-kvm: /builddir/build/BUILD/qemu-2.12.0/target/i386/kvm.c:2119: kvm_buf_set_msrs: Assertion `ret == cpu->kvm_msr_buf->nmsrs' failed.
On this guest, only emulated processors work. Any combination of "host_model" or "host_passthrough" will fail, sometimes even locking up the CentOS VM. VMware Player seems to do nested/nested just fine, but VirtualBox seems to only do one layer of nested VT-x?
Is this a limitation of virtualbox, or have I done something grossly wrong?
submitted by nivek_123k to virtualbox [link] [comments]


2020.08.19 11:30 hakkmj Retrieving Info From Weather API

I am trying to integrate some weather information into a program I am writing for my office. I am using the ClimaCell API. I've not used JSON in VB.NET before, so it's a bit of a learning curve.
I have constructed the api call and it sends JSON formatted data back to me no problem. The problem I am having is making use of this information. Everything I have tried so far to deserialize the data gives an error.
I can print the returned JSON straight to the console and this is part of what is returns:
[{"precipitation":[{"observation_time":"2020-08-20T03:00:00Z","max":{"value":0.875,"units":"mm/hr"}}],"precipitation_probability":{"value":85,"units":"%"},"weather_code":{"value":"rain_light"},"observation_time":{"value":"2020-08-19"},"lat":54.88657,"lon":-1.50226}:
. . . . . . . etc
I am using the Newtonsoft Json library to convert the result:
Dim myJson = JsonConvert.DeserializeObject(Of String)(web.DownloadString(_strurl))
and the error I get on this line is:
"Unexpected character encountered while parsing value: [. Path ", line 1, position 1."
I'm puzzled as to what I'm doing wrong here; the returned information is valid JSON data according to the jsonformatter checker.
So what do I do to pull the information from the returned JSON information and maybe put it into a datatable so I can use it?
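Not part of the original post, but for illustration: the error happens because the response is a JSON array, not a single string, so DeserializeObject(Of String) cannot parse it. A minimal, hedged sketch of one way to handle it with Newtonsoft.Json: parse the array with JArray and copy a few fields into a DataTable. The hard-coded sample JSON, the column names, and the module name are illustrative assumptions; the field paths are taken from the response snippet quoted above.

```
Imports System.Data
Imports Newtonsoft.Json.Linq

Module WeatherSketch
    Sub Main()
        ' Hypothetical stand-in for web.DownloadString(_strurl), shaped like
        ' the sample response quoted above.
        Dim json As String =
            "[{""lat"":54.88657,""lon"":-1.50226," &
            """weather_code"":{""value"":""rain_light""}," &
            """precipitation_probability"":{""value"":85,""units"":""%""}}]"

        ' The response is a JSON array, so parse it as one instead of
        ' calling JsonConvert.DeserializeObject(Of String).
        Dim observations As JArray = JArray.Parse(json)

        ' Flatten the fields of interest into a DataTable.
        Dim table As New DataTable()
        table.Columns.Add("lat", GetType(Double))
        table.Columns.Add("lon", GetType(Double))
        table.Columns.Add("weather", GetType(String))
        table.Columns.Add("precip_prob", GetType(Integer))

        For Each obs As JObject In observations
            table.Rows.Add(
                obs.Value(Of Double)("lat"),
                obs.Value(Of Double)("lon"),
                obs.SelectToken("weather_code.value").ToString(),
                CType(obs.SelectToken("precipitation_probability.value"), Integer))
        Next

        Console.WriteLine(table.Rows.Count & " observation(s) loaded")
    End Sub
End Module
```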
submitted by hakkmj to visualbasic [link] [comments]


2020.08.16 21:22 Lion_TheAssassin After following the installation instructions for MonoDevelop for Debian 10, I cannot find the app icon on my Chromebook. I can see the Mono version, but the icon does not show up.

As the title says, I followed all the command prompts to install MonoDevelop for Debian 10. I had it beforehand, but when I tried to load a VB.NET project it would not load properly, as it did not support .NET 4.5, and checking the version showed me only the 4.6.2 version. I deleted it and tried to install it again. Except now, though I am able to see a more up-to-date version, I cannot find the app in my Chromebook launcher. Is there a command to push the icon onto the launcher? Or did I do something wrong? For reference, though I can see the Mono version, I do not seem able to launch it from the terminal either.
this is the output when i try to do it
Cannot open assembly 'hello.vbproj': File does not contain a valid CIL image.
My Chromebook is the Cb Plus V1, and I was able to open the app before I deleted it.
submitted by Lion_TheAssassin to Crostini [link] [comments]


2020.08.13 13:07 vb-net Blazor syntax and opportunity (what's missing in Blazor?)

http://www.vb-net.com/2020/Index.htm
submitted by vb-net to Blazor [link] [comments]


2020.08.04 16:41 Idiot_COCK Stop complaining about bilingual people switching between languages

This shouldn't need to be said, but apparently it does. Read the whole post before commenting or voting. I'm seeing quite a few comments that fundamentally disagree with what I'm saying, which shouldn't be necessary or possible since everything in this post is true. This is not an opinion piece, and if you question the legitimacy of any of my claims: read the sources I have provided. Despite what some people say, this post is not a strawman; if you read this and don't disagree with what I'm saying, then congrats, you're not the target of this post. Likewise, if you do disagree with what I'm saying (which a lot of people do), or take issue with a part of it, then you are indeed the intended audience of this post, so read it.
This rant is really only directed at a few certain people, but I've seen this a lot, and I'm sure you have too.
There is X character in X media, that speaks two languages and frequently switches between the two. Cue in people complaining, "Wah, why is X character talking like that??? That's SOOOOO dumb!!! No one actually talks like that in real life!!!"
!?!?!? Well guess what, ya pea brain, bilingual people actually fuckin do. Maybe if you interacted with more people, or just understood that there are different types of cultures/people that aren't a part of your small bubble, you wouldn't be complaining about this, since people actually do talk like that. Yes. There are people who speak multiple languages that ACTUALLY switch between their native and secondary languages all the time. GASP!! WHO KNEW?!?
This phenomenon is known as "code switching," with the more extreme example being "mixed languages."
Code switching is probably exactly what you think it is by now: it's when someone who speaks multiple languages, switches between them in everyday conversation.
The reasons why people do vary; I'm just gonna copy and paste this section from Wikipedia, since it explains better than I could:
A particular topic: People generally switch codes during discourse about a particular topic when specific language is necessary or preferred; alternative speech may better convey relevant concepts.
Quoting someone: People will switch codes while quoting another person.
Solidarity and gratitude: When expressing gratitude or solidarity, code-switching can occur inadvertently or with the intention of fostering a rapport.
Clarification: A speaker may engage in code-switching when listeners have difficulty comprehending specific words or concepts initially, or when the speaker does not know or remember the appropriate words in one of the languages.
Group identity: People may alter their language to express group identification. This can happen, for example, when introducing members of a particular group to others.
To soften or strengthen command: While asking someone to do something, code-switching works to mark emphasis or provide inspiration.
Lexical need: People often use technical or idiomatic speech from a foreign or non-primary language; code-switching occurs when translating such words or phrases could distort the precise meaning.
There's different types of code switching as well, and again, right from Wikipedia:
Intersentential switching occurs outside the sentence or the clause level (i.e. at sentence or clause boundaries). In Assyrian-English switching one could say, "Ani wideili. What happened?" ("Those, I did them. What happened?")
Intra-sentential switching occurs within a sentence or a clause. In Spanish-English switching one could say, "La onda is to fight y jambar." ("The latest fad is to fight and steal.")
Tag-switching is the switching of either a tag phrase or a word, or both, from one language to another, (common in intra-sentential switches). In Spanish-English switching one could say, "Él es de México y así los criaron a ellos, you know." ("He's from Mexico, and they raise them like that, you know.")
Intra-word switching occurs within a word itself, such as at a morpheme boundary. In Shona-English switching one could say, "But ma-day-s a-no a-ya ha-ndi-si ku-mu-on-a. ("But these days I don't see him much.") Here the English plural morpheme -s appears alongside the Shona prefix ma-, which also marks plurality.
Mixed language is similar to code switching, but instead of it only being random words/phrases switched, they are entire languages who have been fused together and developed by bilingual communities.
And if you have problems with my use of Wikipedia, visit the page itself, and check their sources, since typically those are reputable and viable.
So yeah, bilingual people do indeed mix languages frequently. As someone who grew up in a fairly bilingual city, with a bilingual family, I've met hundreds of people who switch between their languages, because, hey, that's what they fuckin do.
The next time someone claims otherwise, I'm gonna have a fucking aneurysm. And I see this much too often, with it not being limited to only this trope; where there are people who misunderstand or dislike a trope simply because they don't understand the history or the real life application of it. If you're gonna complain about something in media, especially a trope, please do some research on the topic/idea before hand, because there's probably a genuine explanation and reason on why that trope exists and is used.
And like all tropes: no tropes are bad, there is only bad execution. If you do have a problem with a bilingual character mixing their languages because it sounds awkward or unnatural, then it's probably just due to poor writing. Don't blame the trope, blame the writer.
EDIT: This post is NOT meant to disregard any criticism of how bilingual characters speak or act in fiction, there are certainly writers who are unskilled and lazy with writing bilingual characters, and it comes off as disingenuous and stupid.
But, there are different forms of code switching as well, so for someone who is unaccustomed to the different ways it is used, or who claims that bilinguals don't speak a "certain way" when in certain areas they actually do, this post is meant to explain that. Different cultures and areas express themselves in different ways with their own unique forms of language and prose; simply because you have not come across people who speak in a certain way does not mean such people don't exist.
Bilingual people mainly invoke code switching when speaking to other bilingual people. Code switching can also occur when a speaker converses with a non-bilingual person, but it is much less common, and the conversation/language will obviously not be as complex as when they are speaking to someone who fully understands them; they may only employ a few words or phrases from their native tongue: something relatively easy to understand, even with the language barrier.
If you are genuinely curious and want to learn more about the different forms of code switching and dialects, read the wikipedia article and its sources, or any of these research articles/articles: [1] [2] [3] [4] [5] [6] [7] [8] [9] [10] [11] [12] [13] [14] [15] [16] [17] [18] [19] [20] [21] [22] [23] [24] I'm seeing some anecdotal comments which seem to try to argue against what I'm saying; which wouldn't be necessary if they understood the full breadth and variance that exists with code switching.
EDIT 2: And yes, even the stereotypical Mexican who says "cabron," "pendejo," and "amigo" does exist in real life, and you'd probably see them somewhere in suburbs or cities of the US. Most fictional media presenting bilingual Spanish characters are American based, so this stereotype isn't entirely unfounded. That isn't to say, that ALL American Mexicans or native Spanish speaking characters from their home countries speak like this, since they have their own unique dialects and forms of communication. Language varies across the globe, and for the same reason UK English differs from American English, different regions and cultures have different forms and methods of code switching.
EDIT 3: Adding this since some people lack reading comprehension, and the conversation keeps steering itself towards discussion of the Mexican stereotype in media, with the popular example commenters bring up being Manny, from the TLOU2:
"Lazy" code switching such as the type that Manny does, where he throws in random Spanish words, is a REAL THING. He's a character based in the United States, so in any normal circumstance, this method of speaking is correct. If Manny were a native speaker from a Latin country, then it would be incorrect for him to speak like that, since that type of code switching is a California/United States phenomenon. Manny is using Spanglish.
(You can still have a problem with his method of speech, since realistically he wouldn't even have this dialect; he was born after the outbreak, and unless he was raised in a Spanish/English community, he would just talk regular English like the rest of the cast. And at the end of the day, he is still a stereotype.)
Manny's characterization is a negative stereotype and that deserves criticism, but the way he talks isn't unheard of, and that is partly what I'm defending and discussing in my post.
The overuse of a certain depictions of characters of different ethnicities and their speech, such as the Latino who curses in Spanish and the German who speaks in a funny accent, deserves criticism. There should be a more variation with how authors depict their bilingual characters, instead of picking their accents from a certain region, or certain stereotype, time and time again.
There are people, however, who claim that this stereotype, or rather, their method of speech, isn't real, when it is. People in this comment section have been saying that. And no, that doesn't make it less annoying or less problematic when the more negative depictions come up, i.e. Manny, but that doesn't mean their methods of speech are unrealistic, which people tend to claim.
Depictions of Latino characters who are from their homeland or anywhere that isn't Southern California, who speak with the stereotypical American Mexican stereotype are not correct. We tend so see an over saturation of that stereotype due to laziness on the authors behalf, and/or the location of many large production studios being based in California. That's not to say that those depictions aren't overplayed.
The title of this rant could have been more aptly written, and should have said "Code switching is real, and sometimes the ways it is portrayed in media isn't incorrect."
Also, the information regarding code switching and its types is objectively true; this is not an opinion piece. It is a researched and recorded phenomena. If you question the validity of any of my statements, read the sources I have provided.
TL;DR: Code switching is real, and there are many different types of code switching across the globe. Not all bilinguals speak the same way. Different cultures and areas express themselves in different ways with their own unique forms of language and prose; simply because you have not come across people who speak in a certain way does not mean such people don't exist.
That's not to say that writers are exempt from criticism, especially when they use incorrect prose/language or they misuse/assign a type of code switching to an incorrect region, such as giving native Spanish speakers from their home country, the stereotypical Mexican American method of speech.
The overuse of certain stereotypes deserves criticism, but when some people claim that these stereotypes/types of code switching aren't real, they are wrong.
submitted by Idiot_COCK to CharacterRant [link] [comments]


2020.07.26 20:14 megapuncher01 TD API - need help with Post Orders please

Need help please. VB.NET.
Where I am:
My application is registered on the TDA developer portal. I can authenticate successfully via code, and I can use the retrieved access token to retrieve information on my account and post orders on the TDA developer portal; it works with no issues.
Where my problem is:
I am testing posting an order to my account, using the same access token (both the test token from their Post Access Token endpoint and one I generated myself) and the same order. On the TDA developer portal it works; I can see the order in ToS. Through code, it spits out an Unauthorized error (I try right after the portal, so the access token is valid).
Exact Error: "The remote server returned an error: (401) Unauthorized."
The TDA developer portal has OAuth 2.0, but their Post Schema does not have anything similar.
Any help is appreciated.
Dim request As System.Net.WebRequest = System.Net.HttpWebRequest.Create("https://api.tdameritrade.com/v1/accounts/xxxxxxxxx/orders")
Dim response As System.Net.HttpWebResponse

Dim acctok As String = "xxxxxxxxxxx"
Dim mydata As String = "{" & _
" " & Chr(34) & "orderType" & Chr(34) & ": " & Chr(34) & "MARKET" & Chr(34) & "," & _
" " & Chr(34) & "session" & Chr(34) & ": " & Chr(34) & "NORMAL" & Chr(34) & "," & _
" " & Chr(34) & "duration" & Chr(34) & ": " & Chr(34) & "DAY" & Chr(34) & "," & _
" " & Chr(34) & "orderStrategyType" & Chr(34) & ": " & Chr(34) & "SINGLE" & Chr(34) & "," & _
" " & Chr(34) & "orderLegCollection" & Chr(34) & ": [" & _
" {" & _
" " & Chr(34) & "instruction" & Chr(34) & ": " & Chr(34) & "Buy" & Chr(34) & "," & _
" " & Chr(34) & "quantity" & Chr(34) & ": 1," & _
" " & Chr(34) & "instrument" & Chr(34) & ": {" & _
" " & Chr(34) & "symbol" & Chr(34) & ": PFE," & _
" " & Chr(34) & "assetType" & Chr(34) & ": " & Chr(34) & "EQUITY" & Chr(34) & "" & _
" }" & _
" }" & _
" ]" & _
"}"


request.Method = "POST"
request.ContentType = "application/json"
request.Headers.Add("Authorization", "Bearer <" & acctok & ">")
request.GetRequestStream.Write(System.Text.Encoding.UTF8.GetBytes(MyData), 0, System.Text.Encoding.UTF8.GetBytes(MyData).Count)
response = request.GetResponse
Dim myreader As New System.IO.StreamReader(response.GetResponseStream)
Dim myText As String
myText = myreader.ReadToEnd
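Not part of the original post, but two details in the code above may be worth ruling out as the source of the 401: the HTTP Bearer scheme does not use literal angle brackets around the token (the "Bearer <token>" shown in docs is placeholder notation), and "symbol": PFE without quotes is not valid JSON. Below is a hedged sketch that builds the same order with a serializer so the quoting is handled automatically; the endpoint, field names, and header usage mirror the original code, while PostOrder, accountId, and accessToken are illustrative names, not TDA API requirements.

```
Imports System.Net
Imports System.Text
Imports Newtonsoft.Json

Module TdOrderSketch
    Sub PostOrder(accountId As String, accessToken As String)
        ' Build the order with an anonymous type and let the serializer
        ' handle the quoting ("symbol" must be a quoted JSON string).
        Dim order = New With {
            .orderType = "MARKET",
            .session = "NORMAL",
            .duration = "DAY",
            .orderStrategyType = "SINGLE",
            .orderLegCollection = {
                New With {
                    .instruction = "Buy",
                    .quantity = 1,
                    .instrument = New With {.symbol = "PFE", .assetType = "EQUITY"}
                }
            }
        }
        Dim body As Byte() = Encoding.UTF8.GetBytes(JsonConvert.SerializeObject(order))

        Dim request = CType(WebRequest.Create(
            $"https://api.tdameritrade.com/v1/accounts/{accountId}/orders"), HttpWebRequest)
        request.Method = "POST"
        request.ContentType = "application/json"
        ' "Bearer <token>" in the docs is placeholder notation; send the raw
        ' token without the angle brackets.
        request.Headers.Add("Authorization", "Bearer " & accessToken)

        Using stream = request.GetRequestStream()
            stream.Write(body, 0, body.Length)
        End Using

        Using response = CType(request.GetResponse(), HttpWebResponse)
            Console.WriteLine("HTTP " & CInt(response.StatusCode))
        End Using
    End Sub
End Module
```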
submitted by megapuncher01 to algotrading [link] [comments]


2020.07.16 16:40 Monopolista DKIM-Result: fail (bad signature) with OpenDKIM and Postfix

EDIT: OpenDKIM looks like it won't support ed25519 selectors any time soon; alternative milters are dkimpy-milter and rspamd.
EDIT: Fixed, I'm using a KeyTable + SigningTable setup that seems to work now.

/etc/opendkim/opendkim.conf
```
LogWhy Yes
Syslog Yes
SyslogSuccess Yes
Socket local:/var/run/opendkim/opendkim.sock
UserID opendkim
ReportAddress [email protected]
SendReports yes
Canonicalization relaxed/relaxed
Mode sv
UMask 002
BaseDirectory /var/lib/opendkim
TemporaryDirectory /var/run/opendkim
PidFile /var/run/opendkim/opendkim.pid
Domain domain.tld
KeyFile /etc/opendkim/privkey.pem
InternalHosts refile:/etc/opendkim/hosts
ExternalIgnoreList refile:/etc/opendkim/hosts
KeyTable refile:/etc/opendkim/key_table
SigningTable refile:/etc/opendkim/signing_table
OversignHeaders From
MilterDebug 9
ResolverTracing Yes
```

/etc/opendkim/key_table
```
opendkim._domainkey.domain.tld domain.tld:opendkim:/etc/opendkim/keys/domain.tld/opendkim.private
```

/etc/opendkim/signing_table
```
*@domain.tld opendkim._domainkey.domain.tld
```

tree /etc/opendkim/
```
├── hosts
├── key_table
├── keys
│   └── domain.tld
│       ├── opendkim.conf
│       └── opendkim.private
├── opendkim.conf
├── privkey.pem
└── signing_table
```
Original message:
I'm trying to set up DKIM for our postfix server, here there are the configurations:
/etc/postfix/main.cf
```
smtpd_use_tls = yes
smtp_dns_support_level = dnssec
smtp_tls_security_level = dane
smtpd_tls_cert_file = /etc/letsencrypt/live/domain.tld/fullchain.pem
smtpd_tls_key_file = /etc/letsencrypt/live/domain.tld/privkey.pem

smtputf8_autodetect_classes = sendmail, verify
smtputf8_enable = ${{$compatibility_level} < {1} ? {no} : {yes}}
strict_smtputf8 = no

smtpd_milters = unix:/var/run/opendkim/opendkim.sock
non_smtpd_milters = $smtpd_milters
milter_default_action = accept
```

/etc/opendkim/opendkim.conf
```
Domain domain.tld
KeyFile /etc/opendkim/privkey.pem
Selector opendkim
Socket local:/var/run/opendkim/opendkim.sock
UserID opendkim
Canonicalization relaxed/relaxed
BaseDirectory /var/lib/opendkim
TemporaryDirectory /var/run/opendkim
UMask 002

# the following file contains just 127.0.0.1, localhost and domain.tld
InternalHosts /etc/opendkim/hosts
ExternalIgnoreList /etc/opendkim/hosts

LogWhy Yes
Syslog Yes
SyslogSuccess Yes
MilterDebug 9
ResolverTracing Yes
```
I have generated the txt record and drill txt opendkim._domainkey.domain.tld -t returns: ``` ;; opendkim._domainkey.domain.tld. IN TXT
;; ANSWER SECTION: opendkim._domainkey.domain.tld. 2399 IN TXT "v=DKIM1; k=rsa; p=MIIBIjANBgkqhkiG9w0BAQEFAAOCAQ8AMIIBCgKCAQEAujY4bjiuY+CxcgTrF42+gMgTsjaGe/XSbgkQCIx3uS3iw+emqB4wyujmU9lycv8kgdu9n1aqu5S8KLlRGk0JsJhJawf2B7Fdxl9/rtnavau0NYVYgu3jtrhyA9khTbXb7/3z5OHHL36KLVRFdPi+7S6SXas7BGloTuPQa1YmNb09y6ouCEYA3vRBA3uklXd" "M3aTVYKY1WZbZtS8C4WRYzlj4J91bCE9UAyFbiFu/RpazVhjDCdKte4/UHZbXG6IDc6w6bWJqgs2Fer1VXudpNoy692tcGYHmazwTuXem1P9ItwlOu0KuuBgcLKDzK4mKMvwfHVINKM/Myoy55wIDAQAB" ```
I have another DNS record for sendgrid, default._domainkey.domain.tld: v=DKIM1; k=rsa; p=MIIBIjANBgkqhkiG9w0BAQEFAAOCAQ8AMIIBCgKCAQEA4uo7Nto36e/Gx9KftLxvwkKOlIw6lmVQkGMu5Q6XeBDzUFC8hHafoHkHosg86TySOa/vE1x5hM505HUpvnhgdFUQIfEG/UwgeXSNJ0LOdN+d/X76SqOUBfiSk
These are both valid according to online DKIM record checkers. However, https://www.appmaildev.com/en/dkim returns the following if I send through Postfix:
``` DKIM-Signature: v=1; a=rsa-sha256; c=relaxed/relaxed; d=domain.tld; s=opendkim; t=1594909010; bh=91RazCIc2UKGe+HBO/tFDtuL2v1Tzk19s+Q44fL6gEY=; h=Subject:Date:From; b=a20CL2OU8ekxLmXZZ+F3cnWE4iwFn7aYuzoUMDyJBoIm5+BLSJ2Md+SokBOfbgFse iiNn3ik8jKs6WSowzch0S51zQDH96qksH6/PP/hVSNUxvoyt6VHVxiI/N5jXaUz2bP IKypQWIBSoFWpWI9Oq6Yad18fwI2wVzjDdzn6c6mLiq1mIRrmhmGykjbCLTxForv5U 6F2OQONegE49N+Vx4pg2eqnCVrA3IhrlwC+bYIjak0pIBwGJ826yqN0LMQrUFREdWd 2WNKGdLtm71OoQgN52UOP+utn2MlmXZ4T9JGO5H+YbQRbyuAzAZ31EMbrEOoyqbEgI LklT6tFJz2dAw== Signed-by: [email protected] Expected-Body-Hash: 91RazCIc2UKGe+HBO/tFDtuL2v1Tzk19s+Q44fL6gEY= Public-Key: v=DKIM1; k=rsa; p=MIIBIjANBgkqhkiG9w0BAQEFAAOCAQ8AMIIBCgKCAQEAujY4bjiuY+CxcgTrF42+gMgTsjaGe/XSbgkQCIx3uS3iw+emqB4wyujmU9lycv8kgdu9n1aqu5S8KLlRGk0JsJhJawf2B7Fdxl9/rtnavau0NYVYgu3jtrhyA9khTbXb7/3z5OHHL36KLVRFdPi+7S6SXas7BGloTuPQa1YmNb09y6ouCEYA3vRBA3uklXdM3aTVYKY1WZbZtS8C4WRYzlj4J91bCE9UAyFbiFu/RpazVhjDCdKte4/UHZbXG6IDc6w6bWJqgs2Fer1VXudpNoy692tcGYHmazwTuXem1P9ItwlOu0KuuBgcLKDzK4mKMvwfHVINKM/Myoy55wIDAQAB;
DKIM-Result: fail (bad signature) ```
If I set Sendgrid as relay, instead, it passes: ``` DKIM-Signature: v=1; a=rsa-sha256; c=relaxed/relaxed; d=domain.tld; h=subject:from:content-type:content-transfer-encoding; s=s1; bh=91RazCIc2UKGe+HBO/tFDtuL2v1Tzk19s+Q44fL6gEY=; b=F0wcRohkfCPToHajx2CqybsT1dNpwWXWhO5IFoURVbYiUdW3nAl3/VK/2ts7/qPnHJmH KtmjBa8L2qizSUebP6ZngOmyQbF+EZd3YQsZZSiOFQZg556O7vmcxZpCJhkP3jymfkjIzF Om9vMEN8WMHAVeXG/JWPasXDYXZkl+nvM= Signed-by: [email protected] Expected-Body-Hash: 91RazCIc2UKGe+HBO/tFDtuL2v1Tzk19s+Q44fL6gEY= Public-Key: k=rsa; t=s; p=MIGfMA0GCSqGSIb3DQEBAQUAA4GNADCBiQKBgQC/NCMbyTlZN/Aoxr1h16709mwRetG8ZT7wNnRFUUzEEdO7FzpNGojFg5rJzjy8Du075idi9Gh5AWaLoYZLgYUOzCo4wH5Ujy6BeXIZdUQlQLB3UHfHirqKDFudGr150Ea8u+HuPYEbkzPkvjS0WIFT2UXTV3ov2Qc3O0tVB/+CnwIDAQAB;
DKIM-Result: pass ```
Logs from postfix for the fail email:
Jul 16 16:16:49 hostname postfix/pickup[11174]: E6A3A7FAE8: uid=1000 from=
Jul 16 16:16:50 hostname postfix/cleanup[11255]: E6A3A7FAE8: message-id=<[email protected]>
Jul 16 16:16:50 hostname postfix/qmgr[11175]: E6A3A7FAE8: from=, size=220, nrcpt=1 (queue active)
Jul 16 16:16:51 hostname postfix/smtp[11258]: E6A3A7FAE8: to=, relay=appmaildev.com[13.67.59.48]:25, delay=2.9, delays=1.8/0.1/0.49/0.5, dsn=2.6.0, status=sen>
Jul 16 16:16:51 hostname postfix/qmgr[11175]: E6A3A7FAE8: removed
Logs from opendkim for the fail email: Jul 16 16:16:50 hostname opendkim[11238]: E6A3A7FAE8: DKIM-Signature field added (s=opendkim, d=domain.tld)
Logs from postfix for the pass email:
Jul 16 16:41:53 hostname postfix/pickup[11804]: F33E67FAE8: uid=1000 from=
Jul 16 16:41:54 hostname postfix/cleanup[11809]: F33E67FAE8: message-id=<[email protected]>
Jul 16 16:41:54 hostname postfix/qmgr[11805]: F33E67FAE8: from=, size=220, nrcpt=1 (queue active)
Jul 16 16:41:55 hostname postfix/smtp[11812]: F33E67FAE8: to=, relay=smtp.sendgrid.net[159.122.219.55]:587, delay=1.3, delays=0.33/0.1/0.79/0.07, dsn=2.0.0, s>
Jul 16 16:41:55 hostname postfix/qmgr[11805]: F33E67FAE8: removed
Logs from opendkim for the pass email (why did it pass through opendkim?): Jul 16 16:41:54 hostname opendkim[11238]: F33E67FAE8: DKIM-Signature field added (s=opendkim, d=domain.tld)
What is wrong with my setup? I tried a 1024-bit public key but nothing changed; the DNS server is Cloudflare.
submitted by Monopolista to sysadmin [link] [comments]


2020.07.14 08:27 Davidgogo Evidence of God in plain sight Part III of III

Layer 5
Here is another gem that warrants a new layer. Dr. Haifeng Xu and Zuyi Zhang, with Ali Adams, have also done some fantastic work on the subject. Their findings on Surat al-Fatiha are amazing. There are 7 verses, 29 words, and 139 letters in the chapter. What they have done is to identify a series of what they call the Quran Triplets. In this particular case, it computes like this: 7=7, 2+9=11, 1+3+9=13, and results in prime numbers in either direction, 729139 and 139297, with the former also yielding a prime digit sum, 7+2+9+1+3+9=31. Once again, I think we are just scratching the surface here.
The kicker is, when a slightly different and more popular spelling convention is considered, the 7 verses and 29 words stays the same, as is the case with the previous spellings, but the letter count increases to 143. The remarkable thing to note is that 143 also happens to be a prime number. Not only that but the new sum results in yet another prime, 179. Furthermore, when 'wah' is taken as a separate word, the word count changes to 31(a prime number). Now we have a new set 7+31+143=181. Yes, 181 is also a prime number. It seems the Quranic patterns have a very high tolerance level. Even when we stick to the original 139 letters and count 'wah' as a separate word the resulting count is 7+31+139=177, another additive prime.
And what does it say in the first chapter? It is not some gibberish in order to make all these patterns come together. The words of the chapter are some of the most profound words ever written. Even in translation the sense is there:
1:1 In the Name of God, the Compassionate, the Merciful.
1:2 Praise be to God, the Lord of the worlds,
1:3 the Compassionate, the Merciful,
1:4 Master of the Day of Judgement.
1:5 Thee we worship and from Thee we seek help.
1:6 Guide us upon the straight path,
1:7 the path of those who Thou hast blessed, not of those who incur wrath, nor of those who are astray.
The poetic brilliance is hard to translate but notice how the narrative is split either side of the verse that mentions worship and help (verse 5). Glorification verses are in the top part and those seeking help are at the bottom part. This obviously is in addition to the prime numbers array detailed above.
If that was not enough, there is another layer within this layer, on top of or below the one detailed above. Like so many other verses and chapters, this chapter is also written using only a limited number of letters; in this case, 21 (7x3) letters of the Arabic alphabet are used. Furthermore, each verse of the chapter ends with either Nun (N) or Meem (M). When all the words ending in Nun are added up, the total is 7, and not surprisingly, all the words ending with Meem also add up to 7. In fact, the chapter is even more tightly knit around the number 7. Please go here to download the book for more details.
While we are on the subject of the first chapter, here is yet another layer that adds to the complexity from another angle.
In Kaheel's words:
The Qur’an’s opening chapter consists of seven verses, and as such, each verse ends with a specific word, which acts as a kind of interval or break, separating the verses from each other.
The number representing the letter count of each of Al-Fatiha’s intervals is
7865676, which is indeed a multiple of 7: 7865676 = 1123668 x 7
But that’s not all, because the result is also a multiple of 7: 1123668 = 160524 x 7
And this result is also a multiple of 7: 160524 = 22932 x 7
And this result is again a multiple of 7: 22932 = 3276 x 7
This result is yet again a multiple of 7: 3276 = 468 x 7
In other words, our original number 7865676 is a multiple of 7 five times!
7865676 = 468 x 7 x 7 x 7 x 7 x 7
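The chain of divisions above is easy to reproduce. Here is a minimal VB.NET sketch (an illustration, not Kaheel's own code) that keeps dividing 7865676 by 7 and counts how many times it divides evenly; per the passage, it should stop at 468 after five divisions.

```
Module IntervalCheck
    Sub Main()
        ' Repeatedly divide 7865676 by 7, as in the passage above.
        Dim n As Integer = 7865676
        Dim divisions As Integer = 0
        While n Mod 7 = 0
            n \= 7
            divisions += 1
        End While
        ' Per the text this should print "468 after 5 divisions by 7".
        Console.WriteLine($"{n} after {divisions} divisions by 7")
    End Sub
End Module
```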
Prof. Ali R. Fazely has taken the prime number aspect to a completely new level. He has covered everything from twin primes and lonely primes all the way to Mersenne primes and Gaussian primes. Let us not forget that most of the heavy lifting in the prime numbers field was done from the fifteenth century onward.
A sample of the finding associated with the good professor can be gauged from examples like this:
The number of chapters (suras) in the Quran is 114, which is a multiple of 19: 114 = 6x19. Prof. Ali R. Fazely discovered that the number 619 is a prime number, and it is the 114th prime.
His findings are still being explored, as one would expect anything that is related to prime numbers to be. But the very fact that the tentacles of these Quranic patterns deep dive into the yet to be explored facets of Mathematics, is certainly food for thought.
Note: A word of caution; the good professor is apparently a follower of Rashad Khalifa, who unnecessarily insisted on dropping the last two verses of chapter nine. Some of his findings appear to be directed towards that cause, but perhaps out of loyalty rather than any real merit. But he is after all an astrophysicist and his first loyalty is to scientific proof. I am sure that after analyzing the data from other researchers he will think differently.
Layer 6
Perhaps it would be appropriate to now add an aspect of the patterns that put this notion of the last two verses of chapter nine to rest. The number 1957 in the above example of Jesus and Adam shows that even with the original text of chapter nine, the pattern holds. Dropping the two verses also disturbs the total number of times Allah is mentioned, 2699, befittingly a prime number.
The ‘odd even’ conundrum. Let’s keep two numbers in mind:
  1. The sum of the series 1 + 2 + 3 + ... + 112 + 113 + 114 = 6555
  2. The total number of verses in the Quran as universally printed: 6236
The two numbers result in 12791 when added up, which is another prime number.
These findings are based on the work done by Dr. Eng. Halis Aydemir (his work is reportedly based on earlier researchers). The discoveries fill an entire book, but we will focus on the one finding that validates the integrity of the verse and chapter structure of the Quran and, at the same time, dispels notions of addition or subtraction of verses.
The findings are pretty straightforward, if we were to first create two columns, one for the index of the chapters of the Quran and the other for the number of verses contained in that particular chapter. After that, all we will do is simply add the index to the number of verses of the corresponding chapter and write the results in a third column. Hence the first entry would be 1+7 = 8; second would result in 2 + 286 = 288 and so on. This would give us sums that would either be an even number or an odd number. Next, we add two more columns and write the odd sums in one and the even numbers in the other. What we get is 57 instances of odd numbers and 57 instances of even numbers, a statistical probability but still a perfect balance. Lastly, we add up all the numbers in each of the last two columns. The even column adds up to 6236 and the odd gives us 6555. The very same numbers we started out with.
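As a small aid to readers who want to redo the bookkeeping just described, here is a minimal VB.NET sketch (not from the source): it pairs each chapter index with a verse count supplied by the reader and splits the sums into odd and even buckets. The handful of verse counts passed in Main is only an illustrative slice; the 57/57 split and the 6236/6555 totals quoted above require the full 114-chapter table.

```
Module OddEvenBalance
    ' Splits (chapterIndex + verseCount) sums into odd and even buckets,
    ' as described above. The verse counts themselves are the reader's input.
    Sub Tally(verseCounts As Integer())
        Dim oddSum As Long = 0, evenSum As Long = 0
        Dim oddCount As Integer = 0, evenCount As Integer = 0

        For chapter As Integer = 1 To verseCounts.Length
            Dim s As Integer = chapter + verseCounts(chapter - 1)
            If s Mod 2 = 0 Then
                evenSum += s : evenCount += 1
            Else
                oddSum += s : oddCount += 1
            End If
        Next

        Console.WriteLine($"even: {evenCount} sums totalling {evenSum}")
        Console.WriteLine($"odd:  {oddCount} sums totalling {oddSum}")
        ' Per the text, the full 114-chapter table should give 57/57 and
        ' totals of 6236 (even column) and 6555 (odd column).
    End Sub

    Sub Main()
        ' First few chapters only, as an example (1+7=8, 2+286=288, ...).
        Tally({7, 286, 200, 176, 120})
    End Sub
End Module
```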
Professor Ali R. Fazely pops up even here. This an excerpt from his paper from way back in May 1991:
“Let me now go back to the EVEN and the ODD. Let us look at the number of verses in each chapter and see if it is odd or even? If we do this, we find that there are 60 chapters in the Quran which possess an even number of verses and 54 which possess an odd number of verses (60+54=114). Of course, if you add these two numbers you obtain 114 as you should. However, the intricacy of the Quran’s mathematical code becomes more overwhelming if we examine these numbers as the indices of Prime numbers.
What do I mean by that?
For example, the 54th Prime number is 251 and the 60th Prime number is 281. Now let us add them up. 251 + 281 = 532 and 532 = 28 x 19
This is only part of the story! Remember that there are 112 verses in the Quran which are not numbered. These are the Basmalahs which are in the beginning of every chapter except for chapter 9. What happens if we include these verses in the total number of verses in each chapter?
We obtain 52 even numbers and 62 odd numbers.
The 52nd Prime number is 239 and the 62nd Prime number is 293.
Let’s add them up. 239 + 293 = 532 and 532 = 28 x 19 Glory be to Allah. How is that for a miracle. Do not relax! There is still more!
We have two pairs of Prime numbers 251, 281 and 239, 293. Let’s add up all the digits in these two pairs and see what we get! 2 + 5 + 1 + 2 + 8 + 1 = 19 and 2 + 3 + 9 + 2 + 9 + 3 = 28
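The prime indexing in the quote is easy to check mechanically. Below is a small, brute-force VB.NET sketch (fine at these sizes) that looks up the n-th prime and reproduces the additions in the quote; IsPrime and NthPrime are illustrative helper names, and the expected outputs (251, 281, 239, 293 and the two sums of 532) are taken from the text above.

```
Module PrimeIndexCheck
    Function IsPrime(n As Integer) As Boolean
        If n < 2 Then Return False
        For d As Integer = 2 To CInt(Math.Sqrt(n))
            If n Mod d = 0 Then Return False
        Next
        Return True
    End Function

    ' Returns the n-th prime (1-based), by brute force.
    Function NthPrime(n As Integer) As Integer
        Dim count As Integer = 0
        Dim candidate As Integer = 1
        Do
            candidate += 1
            If IsPrime(candidate) Then count += 1
        Loop Until count = n
        Return candidate
    End Function

    Sub Main()
        ' Per the quote these should print 251, 281, 239, 293,
        ' and both pair sums should be 532 = 28 x 19.
        Dim p54 As Integer = NthPrime(54)
        Dim p60 As Integer = NthPrime(60)
        Dim p52 As Integer = NthPrime(52)
        Dim p62 As Integer = NthPrime(62)
        Console.WriteLine($"{p54} + {p60} = {p54 + p60}")
        Console.WriteLine($"{p52} + {p62} = {p52 + p62}")
    End Sub
End Module
```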
A single addition or subtraction will result in the collapse of this amazing array. One can always drop verses in multiples of two and then claim that the new count is the correct number of verses but that just flies in the face of the actual Quran in circulation. Mind you this is not simply a case of getting the even and odd right, we must bring in the average number of verses in each chapter as an additional variable. (To claim that it is an unintentional coincidence is even more bizarre, especially when it is referenced in the Quran.) Failing which, the total sums don’t add up to the two very relevant numbers. Mind you this is not the only instance of the odd even patterns. For a quick visual confirmation of the chapter wise distribution please refer to the illustration below.

https://preview.redd.it/lov5uuh780b51.png?width=602&format=png&auto=webp&s=c41648e8a6bbff7c479915abbd8e3488daf6125e
Figure: Chapters 1 to 57

https://preview.redd.it/r44ntvt980b51.png?width=602&format=png&auto=webp&s=558e5b9b11fa765c1bb46d6a906bad7dfcc088dd
Figure: Chapters 58 to 114
Layer 7
Allow me to share an aspect of the Quran that in my opinion is complex enough to amaze just about anybody and yet easy enough to verify within minutes. This is based on the excellent work by Abduldaem Al-Kaheel. This is not something new, in fact his work has been around for almost two decades. It is a pity that all the attention of this space is hogged by the 19 based patterns and most people stop looking after being exposed to the “controversy”.
The interesting thing about this particular case is that one can plot and visualize the core of the phenomenon on a page and half.
The most repeated verse in the Quran by far is:
“Then which of the Blessings of your Lord will you both (jinns and men) deny?”
In fact, it is repeated word for word 31 times in the same chapter. Hence relatively very easy to verify and validate. No need to run to the “experts” or juggle dozens of variables.
What Kaheel noticed was that in chapter 55, Surah Ar-Rahman, the above verse is repeated in a very distinct pattern. The first occurrence of the verse is at the 13th verse of the surah and the last at the 77th. I will copy-paste the entire chapter from his article for easy reference. The verse in question is in bold for easy verification.

https://preview.redd.it/h67494pw80b51.png?width=602&format=png&auto=webp&s=4bc57d21d51c2e5dedbae8021a50d8d2417f5bce
Surah Rehman is a profound commentary on the world around us, in fact the references are not confined just to this world but extend all the way to the Universe beyond our Universe, for want of a better term. Our attention is drawn to the symmetry and balance of God’s creation and true to the layered nature of God’s message we are gifted with what can only be described as a uniquely amazing sign.
Kaheel’s claim is pretty straightforward and consistent with all his other work which in turn is inspired by Quran 15:87 as previously indicated. What Kaheel noticed was that the 31 occurrences of the verse in the chapter follow a complex pattern based on the number 7. Although the moving parts of the pattern are restricted to just four variables and the number 31 itself, the resulting pattern is mind blowing.
Just like his other findings, it can be demonstrated that every aspect of Quran’s order belongs where it is currently positioned.
Kaheel’s methodology is consistent throughout his work and is based on repeating sevens. That is, he places one number in front of the other in a place notation pattern and reads the resulting number as it appears. Please go here and refer to Part 1 to see a detailed explanation. But here is an extract:
“Many researchers into the Qur’anic numbers have attempted to extract a numeric miracle from the Qur’an, and most of these studies have concentrated on adding letters and words together.
But mathematics has revealed that more complex methods can be employed, such as positional notation, which we have used extensively to reveal a new, dynamic and captivating miracle of numbers.
The mathematical technique known as positional or place-value notation has proven its effectiveness and brilliance across the entire Qur’an. The magnificence of this concept lies in its simplicity. People from all walks of life and areas of knowledge use it every day. Yet because it has no limits, massively large numbers are often achieved, which only adds to the awe-inspiring feeling one receives when such numbers turn out to be perfect, decimal-free multiples of 7, or even multiples of 7 twice, three times or more.
To explain this concept, we start by saying that every number is composed of digits, and every digit in that number possesses a place value (ones, tens, hundreds, thousands, etc.).
Also, every place value is ten times greater than the one preceding it. The origin of this system can in fact be taken from the Qur’an itself, where God Almighty speaks of the rewards of those who perform good deeds and specifically mentions the number 10:
He that doeth good shall have ten times as much to his credit... Al-An‘am, 6:160
We can understand this system by writing a chain of numbers based upon the number 10:
1 – 10 – 100 – 1000 – 10000 – 100000 – .......
Every number in the chain is ten times greater than the number before it. To give a more practical example, we know that the number of verses in the Qur’an is 6236. Each of this number’s four digit has a place value:
6          2         3      6
thousands  hundreds  tens   ones

This can be represented in numerical form:

6          2         3      6
1000 x 6   100 x 2   10 x 3  1 x 6
The sum of this chain is of course the original number:
6000 + 200 + 30 + 6 = 6236
After arranging the numbers of the 31 verses shown above side by side, the result is a huge 62-digit number, beginning with the 13 of the first occurrence:
13161821232528303234363840424547495153555759616365676971737577
When you divide this number by 7, we get
=1,880,260,176,075,471,890,623,405,774,935,356,450,507,965,659,480,810,995,962,511
Here is where the complexity is taken to another level. When we reverse the above number pair by pair (that is, take the pair 77, then place the next pair 75 after it, and so on all the way down to 13), we get
77757371696765636159575553514947454240383634323028252321181613
and when we divide this number by 7, we get
= 11,108,195,956,680,805,165,653,650,502,135,350,605,769,090,617,575,464,617,311,659
It is also wholly divisible by 7
Had we instead reversed the number digit by digit, without the positional consideration (breaking up the two-digit pairs that reference each occurrence of the verse), we would get
77573717697656361695755535159474542404836343230382523212816131
Now this number cannot be used as a placeholder for each of the 31 verses in question: although the first two digits, 77, still correctly hold the 31st occurrence of the verse, we run into trouble with the very next two digits, 57, and the number loses the property of divisibility by 7 both ways.
Hence if we were to divide the above number by 7, we will get 11,081,959,671,093,765,956,536,505,022,782,077,486,405,191,890,054,646,173,259,447.28
It is not wholly divisible by 7. We get a decimal value of .28
Please feel free to verify the calculation for yourself here
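For readers who would rather reproduce the arithmetic locally than through an online calculator, here is a short VB.NET sketch using System.Numerics.BigInteger (an illustration, not Kaheel's own code): it rebuilds the 62-digit number from the 31 verse numbers listed above, checks divisibility by 7, and repeats the check on the pair-by-pair reversal. Per the text, both remainders should come out to 0.

```
Imports System.Linq
Imports System.Numerics

Module SevenCheck
    Sub Main()
        ' The 31 verse numbers where the refrain occurs (from the chapter above).
        Dim verses As Integer() =
            {13, 16, 18, 21, 23, 25, 28, 30, 32, 34, 36, 38, 40, 42, 45, 47,
             49, 51, 53, 55, 57, 59, 61, 63, 65, 67, 69, 71, 73, 75, 77}

        ' Place the two-digit verse numbers side by side (positional notation).
        Dim forward As String = String.Concat(verses.Select(Function(v) v.ToString()))

        ' Reverse pair by pair: 77, 75, ..., 16, 13.
        Dim backward As String = String.Concat(verses.Reverse().Select(Function(v) v.ToString()))

        Dim f As BigInteger = BigInteger.Parse(forward)
        Dim b As BigInteger = BigInteger.Parse(backward)

        ' Per the text, both remainders should be 0.
        Console.WriteLine($"{forward} mod 7 = {f Mod 7}")
        Console.WriteLine($"{backward} mod 7 = {b Mod 7}")
    End Sub
End Module
```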
In Kaheel’s words, yet another 31 is embedded in the chapter:
“Another beautiful consistency I noticed relates to the following Verse of the Chapter we dealt with, The Beneficent. It reads: (We shall attend to you, O you two classes (jinns and men)! (The Beneficent: 31). In Arabic, these ‘jinns and men’ are referred to in this Verse in just one word, namely (Ath-thaqalan). It should be noted that this is the only Qur’anic reference to both mankind and jinns collectively in one word. So, I wondered: is there any relationship between this Verse and the repeated Verses, which directly address jinns and men?
I realized that that Verse was the 31st Verse of the Chapter! In other words, the number of repeated Verses which directly address jinns and men are 31, and so is the Verse number where they are collectively addressed in one word. Again, we find that this consistency could not have easily arrived by chance. ”
Note: Kaheel has one more element on his site but I can’t make it work so I excluded it. It has to do with the sum of all the numbers from 1 till 31.
Ali Adam had noticed that if one were to add the verse numbers where the phrase is to be found, the sum of all those verses equals 1433, also a prime number.
13 + 16 + 18 + 21 + 23 + 25 + 28 + 30 + 32 + 34 + 36 + 38 + 40 + 42 + 45 + 47 + 49 + 51 + 53 + 55 + 57 + 59 + 61 + 63 + 65 + 67 + 69 + 71 + 73 + 75 + 77 = 1433, which is a prime number.
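This, too, takes only a couple of lines to check; a small Python sketch using trial division for the primality test:

```
verses = [13, 16, 18, 21, 23, 25, 28, 30, 32, 34, 36, 38, 40, 42, 45, 47, 49,
          51, 53, 55, 57, 59, 61, 63, 65, 67, 69, 71, 73, 75, 77]

total = sum(verses)
is_prime = total > 1 and all(total % d for d in range(2, int(total ** 0.5) + 1))
print(total, is_prime)   # 1433 True
```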
Note: Although some of Adam’s conclusions are very speculative, the findings themselves are solid.
Layer 8
Last but not least, no write-up on the mathematical patterns in the Quran would be complete without touching on what are called Huruf-e-Muqaṭṭaʿāt, or the special letters. Combinations of 1 to 5 letters, drawn from a set of 14 letters (half of the alphabet), make up these special openings, and 29 chapters out of the 114 start with them. My research so far points to something significant, but the picture is not complete. There are two camps; one bases its findings on the number 19 and the other on the number 7. I feel that, in spite of some interesting patterns emerging, both at times are reaching. However, when we combine all the relationships discovered so far, there is more than enough material to construct a coherent picture.
As I already pointed out, we are only scratching the surface at this stage.
Concluding remarks
If someone were to ask us to devise a system to preserve the content of an evergreen guide for the next couple of thousand years, how would we go about doing it? Some thoughts. First of all, that’s a tough one, because it is virtually impossible to envision all that could happen in such a long stretch of human development.
Some of the stuff that may come to mind: a time capsule, encryption, building a pyramid; but since it is a guide, one must also make it accessible. Build a site around it and pay the hosting fees for 2000 years in advance? Convince people to memorize it en masse, word for word, and then not only convince each successive generation to do the same for over 50 generations but provide them with the much-needed incentive to do so. I think it is fair to realize we will be up against it. To understand the problem in depth, please go here.
Now let us take it up a notch. We also must ensure that the guide is the only one that should survive for such a long period of time in its original form. I am not even going to attempt a solution here.
Not only that but devise a mechanism that will enable people in each era to check the guide for authenticity and be able to verify the same independently and from within the guide itself. We are not just looking for longevity but also certainty.
One last thing, the language of the evergreen guide should also survive the time period in such a way that even with the natural evolution of the language the guide’s message is understood perfectly.
And yes, assume you are living in the sixth century.
Of course, God did just that and did it in a dozen different ways. Some of the other mechanisms tend to end up in the hands of the “experts” and people are left once again to believe them. But the above examples are within the reach of all reasonably educated individuals, especially chapter Quran 78, which is so straightforward that even a 12-year-old will be able to verify it within a very short time.
So how does it all come together?
The mathematical patterns embedded in the Quran provide ample evidence/proof that it is the work of an infinitely intelligent being, but the fact that the author of the Quran claims to be God is significant, and we will use it in our conclusions. Needless to say, in the 1400 years since the compilation of the Quran, nobody has even come close to the literary eloquence of the text alone, even without the added difficulty of embedding mathematical patterns.
The significance of the author of the Quran claiming to be God becomes apparent when we are unable to explain the Quran's presence among us. In the absence of an alternative explanation, if somebody now turns around and tells the author, so to speak, 'you are not God', then the logical question must be: on what basis is this objection raised? Is the objector intelligent enough to second-guess the author? The author of the Quran has already demonstrated His infinite intelligence, for want of a more suitable term; the objector has not. In fact, the objector is unable to come up with even a sequence of three short verses in response to the gauntlet thrown down by the author of the Quran. Logic dictates going with the infinitely intelligent being, call it what you want. I go with God because that is what the being is claiming. Since no other being who has claimed to be God, or about whom such a claim is made, has demonstrated the same degree of intelligence that would qualify one as a God, once again logic dictates going with the one who has.
In order to get a sense of what is going on here, imagine playing chess, go, Monopoly, snakes and ladders, draughts and bridge on the same tabletop simultaneously. Or, for all the gamers out there, imagine playing Call of Duty, Counter-Strike, Minecraft, PUBG and Fortnite, and let’s throw in Pac-Man for effect, all on the same single screen. The analogy is not an exaggeration but an attempt to give a sense of how one would go about trying to play all these games simultaneously. Now think of the mind that put it all together in a perfectly coherent text that serves as a blueprint for the life here and the hereafter.
Some good starting points.
http://www.kaheel7.com/eng/ (Numeric Miracle)
Edip Yuksel’s site where the details of the 365 days can be verified
http://www.yuksel.org/e/religion/365days.htm
Dr. Eng. Halis Aydemir centers on the odd-even aspect of the numeric structure.
http://www.symmetricbook.com/ - the last table is by itself amazing (not up at the moment)
Some useful findings by Prof. Ali R. Fazely
http://journal_of_submission.homestead.com/
And even in RK's legacy one can find good material; just ignore the parts where they keep pushing for dropping the two verses
www.submission.org/miracle/
http://www.masjidtucson.org/quran/miracle/
And here one can find a powerful tool, as well as a fantastic discovery to do with prime numbers
https://qurancode.codeplex.com/
Note: All the above calculations are based on the Hafs version of the Quran, the one used by over 98% of Muslims worldwide. In some counting models the word ‘wa’ (and) is treated as a separate word; in others it is counted together with the following word. The premise is that God’s claim in the Quran, that He is responsible for its gathering, distribution and preservation, resulted in the correct version becoming ubiquitous.
Again, you don't have to take anybody's word for it, just do the math. The internet is full of so-called “debunkers”, each claiming that some part or other has been proven wrong and/or a mere coincidence. Some of these folks should not be allowed to come near the words ‘debunked’ or ‘refuted’. Most of them are simply working it backwards now that the patterns are discovered, not realizing that one of the objectives of these patterns is to out those would-be distorters. This tendency by many to leave the debunking to others is playing its part in preventing people from getting to the truth.
_________________________________________________________
  1. Those who wish to explore Gödel’s proof further can start here: an excerpt from Decoded Science (no longer active).
" A brief summary of this proof, which has five axioms that we assume to be true:1.0 Any “property”, or the negation of that property, is “positive”; but it is impossible that both the property and negation are positive.2.0 If one positive property implies that some property necessarily exists, then the implied property is positive.3.0 The property of being God-like is positive.4.0 Positive properties are necessarily positive.5.0 The property of necessarily existing is positive.
Gödel added three definitions along the way:
1.0 A “God-like” being has all positive properties.
2.0 An “essence” of a being is a property that the being possesses and that property necessarily implies any property of that being.
3.0 The “necessary existence” of a being means that it is necessary that all the essences of that being exist (“are exemplified”).
Gödel proved intermediate theorems and one corollary in the course of his proof. The first two axioms led to “Positive properties may possibly exist (‘be exemplified’)”. After adding the third axiom, God, the God-like being, may possibly exist. With the help of the fourth axiom, Gödel stated that “being God-like” is one essence of any God-like being.
After adding the final axiom, Gödel concluded that it is necessary that God exists.
For those who can decipher this, enjoy:

https://preview.redd.it/3zlr1u2190b51.jpg?width=602&format=pjpg&auto=webp&s=a08c6939cff9cd1534b7e4aa000f6bc732a69113
Recently a paper was published wherein Benzmüller and Paleo state that they used several different modal logic systems to verify Gödel’s proof. Those are different logic systems, not just different computer programs. The logic systems were:
‘K’, a “weak” logical system named for logician Saul Kripke; see the next paragraph.
‘B’ logic adds “A implies the necessity of the possibility of A” to ‘K’ logic.
‘S4′ and ‘S5′ logic allow some simplification of repeated “possibility” and “necessity” operations."
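For readers who want symbols rather than prose, the axioms, definitions and theorems summarised in the excerpt are commonly written as below. This follows Dana Scott's version of Gödel's argument, the formulation Benzmüller and Paleo verified; the notation is supplied here as a reference point and is not quoted from the excerpt itself:

```
A1:  \forall\varphi\,[P(\neg\varphi) \leftrightarrow \neg P(\varphi)]
A2:  \forall\varphi\forall\psi\,[(P(\varphi) \wedge \Box\forall x\,(\varphi(x) \rightarrow \psi(x))) \rightarrow P(\psi)]
T1:  \forall\varphi\,[P(\varphi) \rightarrow \Diamond\exists x\,\varphi(x)]
D1:  G(x) \leftrightarrow \forall\varphi\,[P(\varphi) \rightarrow \varphi(x)]
A3:  P(G)
C:   \Diamond\exists x\,G(x)
A4:  \forall\varphi\,[P(\varphi) \rightarrow \Box P(\varphi)]
D2:  \varphi\ \mathrm{ess}\ x \leftrightarrow \varphi(x) \wedge \forall\psi\,(\psi(x) \rightarrow \Box\forall y\,(\varphi(y) \rightarrow \psi(y)))
T2:  \forall x\,[G(x) \rightarrow G\ \mathrm{ess}\ x]
D3:  \mathrm{NE}(x) \leftrightarrow \forall\varphi\,[\varphi\ \mathrm{ess}\ x \rightarrow \Box\exists y\,\varphi(y)]
A5:  P(\mathrm{NE})
T3:  \Box\exists x\,G(x)
```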
submitted by Davidgogo to Quran_focused_Islam [link] [comments]


2020.06.25 03:13 kylespartan626 Ruby in Vagrant setup HELP!!

Hi there. I've been having this issue a while now. I graduated bootcamp and they used vagrant for teaching Rails development. That's what I'm used to. But I wanted to try to make my own Vagrant box to start my personal development environment instead of using the one the bootcamp provided. And the first project is going to be my Web Dev portfolio in Ruby on Rails.
Well, now I'm running into issues installing the latest Ruby version (or even the recommended one if you don't know what you're doing, which is 2.6.6 with devkit). I'm not sure how I'd go about manually installing Ruby with devkit in Ubuntu from the terminal.
I'm at the point where I'm ssh'd into my Vagrant box, and I'm trying to run the command rvm install 2.7.1 and it's throwing an error at me. I'm not sure how to determine exactly what's wrong, but here's the output after that command:
```
[email protected]:~$ rvm install 2.7.1
Searching for binary rubies, this might take some time.
Found remote file https://rvm_io.global.ssl.fastly.net/binaries/ubuntu/18.04/x86_64/ruby-2.7.1.tar.bz2
Checking requirements for ubuntu.
Requirements installation successful.
ruby-2.7.1 - #configure
ruby-2.7.1 - #download
Downloaded archive checksum did not match!
ruby-2.7.1 - #validate archive
bzip2: Data integrity error when decompressing.
    Input file = (stdin), output file = (stdout)
It is possible that the compressed file(s) have become corrupted.
You can use the -tvv option to test integrity of such files.
You can use the `bzip2recover' program to attempt to recover
data from undamaged sections of corrupted files.
tar: Child returned status 2
tar: Error is not recoverable: exiting now
bzip2: Data integrity error when decompressing.
    Input file = (stdin), output file = (stdout)
It is possible that the compressed file(s) have become corrupted.
You can use the -tvv option to test integrity of such files.
You can use the `bzip2recover' program to attempt to recover
data from undamaged sections of corrupted files.
tar: Child returned status 2
tar: Error is not recoverable: exiting now
bzip2: Data integrity error when decompressing.
    Input file = (stdin), output file = (stdout)
It is possible that the compressed file(s) have become corrupted.
You can use the -tvv option to test integrity of such files.
You can use the `bzip2recover' program to attempt to recover
data from undamaged sections of corrupted files.
tar: Child returned status 2
tar: Error is not recoverable: exiting now
The downloaded package for https://rvm_io.global.ssl.fastly.net/binaries/ubuntu/18.04/x86_64/ruby-2.7.1.tar.bz2,
Does not contains single 'bin/ruby' or 'ruby-2.7.1', Only '' were found instead.
Mounting remote ruby failed with status 4, trying to compile.
Checking requirements for ubuntu.
Requirements installation successful.
Installing Ruby from source to: /home/vagrant/.rvm/rubies/ruby-2.7.1, this may take a while depending on your cpu(s)...
ruby-2.7.1 - #downloading ruby-2.7.1, this may take a while depending on your connection...
Downloaded archive checksum did not match!
ruby-2.7.1 - #extracting ruby-2.7.1 to /home/vagrant/.rvm/src/ruby-2.7.1......
Error running '__rvm_package_extract /home/vagrant/.rvm/archives/ruby-2.7.1.tar.bz2 /home/vagrant/.rvm/tmp/rvm_src_2151',
please read /home/vagrant/.rvm/log/1593047102_ruby-2.7.1/extract.log
There has been an error while trying to extract the source.
Halting the installation.
There has been an error fetching the ruby interpreter.
Halting the installation.
[email protected]:~$
```
Here is my Vagrantfile, also. Pretty basic:
Vagrant.configure("2") do config config.vm.box = "ubuntu/bionic64" config.vm.network "forwarded_port", guest: 3000, host: 3000 config.vm.provision "shell", privileged: false, inline: <<-SHELL echo "==> Installing RVM..." # https://rvm.io/rvm/install curl -sSL https://rvm.io/mpapis.asc gpg --import - curl -sSL https://rvm.io/pkuczynski.asc gpg --import - curl -sSL https://get.rvm.io bash -s stable source "$HOME/.rvm/scripts/rvm" rvm install 2.6.6 gem install rails SHELL 
config.vm.synced_folder ".", "/vagrant_files" config.vm.provider "virtualbox" do vb vb.memory = "4096" end config.vm.provider "virtualbox" do vb vb.cpus = 8 end end
Any help will be greatly appreciated. I've been trying to get help on this, and this is the furthest I think I've gotten so far. Besides this, I did get Ruby and Rails to install by running sudo apt install ruby, which gives me a really old version; the same goes for Rails when I try to install it that way, it gives me version 4.something when it's past 6.0.3!
If I could just get a newer version of ruby working and I can install rails, then I'm set!
submitted by kylespartan626 to ruby [link] [comments]


2020.06.17 11:40 belthazubel [Case study] Verbatim comment analysis with NLTK and Python

This post is made out of frustration. Out of frustration that when I went to our UX Research Manager of 18 years with a huge list of verbatim comments I was told, "you'll just have to code them by hand, there is no other way to do it at the moment". That was last year. I was staring at a list of 2000 UserZoom comments and was close to suicide (jk). That's when I came across the NLTK library in Python. Apart from some basic knowledge of HTML and CSS I had no prior programming experience but I'm really lazy and _really_ didn't want to code them all by hand.
I ended up analysing this particular survey by hand as told but over this last year I've been learning coding off and on and put together this script. It's been used on small scale to add extra graphs and pretty word clouds to presentations and other researchers started approaching me with requests.
Problem: a large number of open text comments that need to be themed and coded by hand. Some words are misspelled, and regional differences mean multiple names for the same thing (e.g. "football", "footy").
Solution: a Python script that partly codifies the comments, cleans the text, fixes misspellings and lemmatizes them into a list of individual words that can then be analysed. The output is a frequency graph and a wordcloud.
All the libraries I'm using:

```
import string

import matplotlib.pyplot as plt
import nltk
from nltk import ngrams
from nltk.tokenize import RegexpTokenizer  # used by show_freq below
import pandas as pd
from wordcloud import WordCloud, STOPWORDS

# module-level counters used in the default figure titles below
number = 0
counter = 0
```

```
def df_to_text(df):
    flat_list = []
    df_str = df.applymap(str)
    values = df_str.values.tolist()
    for sublist in values:
        for item in sublist:
            flat_list.append(item)
    text = ", ".join(i for i in flat_list)
    return text
```
First function turns a dataframe into text. It does it by applying a "str" function to everything inside, then for every comment in the dataframe it adds that comment to an empty list. Finally (because list items are separated with a comma) it combines the list into one chunk of text.
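To make that concrete, here is a tiny, hypothetical example; the column names and answers are made up and are not from the real survey:

```
import pandas as pd

df = pd.DataFrame({
    "What causes arguments?": ["money", "chores"],
    "Anything else?": ["tv remote", "nothing"],
})

print(df_to_text(df))
# money, tv remote, chores, nothing
```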
```
def clean_text(text):
    # turn all text to lower case
    text = text.lower()
    # initialise lemmatizer
    lemma = nltk.wordnet.WordNetLemmatizer()
    # tokenize and tag words
    tokens = nltk.word_tokenize(text)
    tagged = nltk.pos_tag(tokens)
    # new empty list
    filtered = []
    # go through tagged list of words and add certain ones to the filtered list, they are also lemmatized
    for word, tag in tagged:
        if tag == "NN" or tag == 'VBG' or tag == 'VB' or tag == 'NNS':
            filtered.append(lemma.lemmatize(word))
    # turn the lists back into a string
    filtered_str = ", ".join(i for i in filtered)
    # some last minute text processing
    filtered_str = string.capwords(filtered_str)
    filtered_str = filtered_str.replace('Tv', 'TV')
    filtered_str = filtered_str.replace('Diy', 'DIY')
    filtered_str = filtered_str.replace('Children', 'Kids')
    filtered_str = filtered_str.replace('Kid', 'Kids')
    filtered_str = filtered_str.replace('Kidss', 'Kids')
    filtered_str = filtered_str.replace('Travelling', 'Travel')
    filtered_str = filtered_str.replace('Bbq', 'BBQ')
    filtered_str = filtered_str.replace('Parc', 'Park')
    return filtered_str
```
Next function cleans the chunk of text we created earlier. The comments are pretty self explanatory, I'd like to just say that the last minute processing is there to fix any common typos that were not picked up earlier. It's a dirty method of doing it but I'm a noob so it's fine.
Also a note on tokens and tags. Tokenising is just breaking sentences into words. For example "Hello there" will become a list of "hello" and "there". Tagging assigns a type to the word, e.g. noun, verb, etc. Lemmatizing will bring a word back to its common meaning, e.g. "running, ran and runner" will become "run, run, run".
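If you want to see those three steps in isolation, a short interactive sketch looks like this (assuming the punkt, averaged_perceptron_tagger and wordnet resources have been downloaded once via nltk.download):

```
import nltk

# one-off downloads, if you don't already have them:
# nltk.download('punkt'); nltk.download('averaged_perceptron_tagger'); nltk.download('wordnet')

tokens = nltk.word_tokenize("The kids were running around the park")
print(tokens)                # ['The', 'kids', 'were', 'running', 'around', 'the', 'park']
print(nltk.pos_tag(tokens))  # tuples like ('kids', 'NNS') and ('running', 'VBG')

lemma = nltk.wordnet.WordNetLemmatizer()
print(lemma.lemmatize("kids"))              # 'kid'  (default part of speech is noun)
print(lemma.lemmatize("running", pos="v"))  # 'run'  (when told the word is a verb)
```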
```
def generate_cloud(filepath, dictionary, title="figure_" + str(number), mask_filepath="None"):
    global number  # ignore this, it's for file naming
    # Import data
    df = pd.read_csv(filepath)
    print(df[0:5])
    # Turn columns into free text
    text = df_to_text(df)
    text = clean_text(text)
    # Create stopword list
    stopwords = set(STOPWORDS)
    stopwords.update(dictionary)
    # Generate WordCloud image
    wordcloud = WordCloud(
        font_path="fonts/OpenSans.ott",
        stopwords=stopwords,
        background_color="white",
        width=1920, height=1080,
        contour_width=0,
        min_font_size=8,
        random_state=1).generate(text)
    # Generate graph
    plt.figure(dpi=72, figsize=(13.65, 10.24))
    plt.title(title)
    plt.imshow(wordcloud, interpolation='bilinear')
    plt.axis('off')
    plt.show()
    # Save to file
    wordcloud.to_file("img/" + title + ".png")
    # Add 1 to counter
    number += 1
    print("Wordcloud generated OK")
```
This bit is fairly out of the box simple. Import a CSV file as a dataframe. Turn it into the list and clean it with the two functions we defined above and spit out a wordcloud. There is another function that counts word frequency:
```
def show_freq(file, title="Figure_" + str(counter)):
    global counter
    # import df
    df = pd.read_csv(file, index_col=0)
    # empty list
    flat_list = []
    # make sure that all cells are strings and extract all values (outputs a list of lists)
    df_str = df.applymap(str)
    values = df_str.values.tolist()
    # turns a list of lists into a single list
    for sublist in values:
        for item in sublist:
            flat_list.append(item)
    # turn list into text and lowercase everything
    text = " ".join(i for i in flat_list)
    text = text.lower()
    # tokenize words and add tags
    tokenizer = RegexpTokenizer(r'\w+')
    tokens = tokenizer.tokenize(text)
    tagged = nltk.pos_tag(tokens)
    # empty list that will hold the final filtered output
    filtered = []
    # lemmatize the list turning 'kids' into 'kid' and 'cars' into 'car' etc.
    lemma = nltk.wordnet.WordNetLemmatizer()
    # look for nouns and verbs and add them to the filtered list
    # this is so we don't end up with As and Is all over the place
    for word, tag in tagged:
        if tag == "NN" or tag == 'VBG' or tag == 'VB' or tag == 'NNS':
            filtered.append(lemma.lemmatize(word))
    # create stopwords that will automatically be dropped
    # this is to avoid any junk like 'Go' which is valid but not useful
    sw = nltk.corpus.stopwords.words('english')
    newStopWords = ['remember', 'going', 'something', 'go', 'none']
    sw.extend(newStopWords)
    # empty list to store words minus the stopwords
    words_ns = []
    # add words to the new list if they are not part of the stopword dictionary
    for word in filtered:
        if word not in sw:
            words_ns.append(word)
    # count frequencies
    nlp_words = nltk.FreqDist(words_ns)
    # display resulting graph and save PNG
    plt.figure(figsize=(10.2, 7.68), dpi=100)
    plt.title(title)
    nlp_words.plot(20)
    plt.show()
```
As you can see it's pretty similar to the wordcloud one, and could potentially be rewritten a bit more neatly.
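For context, this is roughly how the two functions get called; the CSV path and the extra stopword list here are hypothetical placeholders, not the real survey export:

```
extra_stopwords = ["nothing", "n/a", "dont", "know"]

generate_cloud("data/household_survey.csv", extra_stopwords, title="arguments_wordcloud")
show_freq("data/household_survey.csv", title="arguments_frequency")
```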
Outputs:
Fig.1: Word frequency in question "What is the most frequent cause for arguments in your household?"

Fig.2: Same question in form of a wordcloud made into the shape of the UK. I abandoned the UK map in the future iterations because it was misleading people into thinking that it's spread over geographical frequency.

Next steps: read up on sentiment analysis as currently it's the missing piece. Output a graphs of top positive sentiment and top negative sentiments. After that I'd like to read more about adding a theme to the comments.
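This part is not in the script yet, but as a possible starting point, NLTK ships with the VADER sentiment analyzer. A minimal sketch, assuming the vader_lexicon resource has been downloaded and using made-up example comments, could look like this:

```
import nltk
from nltk.sentiment.vader import SentimentIntensityAnalyzer

# one-off download: nltk.download('vader_lexicon')
sia = SentimentIntensityAnalyzer()

comments = [
    "The new layout is brilliant, so much easier to find things",
    "I could not log in and nobody answered my emails",
]

for comment in comments:
    scores = sia.polarity_scores(comment)  # dict with 'neg', 'neu', 'pos', 'compound'
    if scores["compound"] > 0.05:
        label = "positive"
    elif scores["compound"] < -0.05:
        label = "negative"
    else:
        label = "neutral"
    print(label, round(scores["compound"], 3), comment)
```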
Conclusion: hopefully this small case study proves the value of natural language processing for UX research. I started learning Python out of frustration and yes, I did spend a year learning where I could have coded the comments myself in a week, buuuut it was fun and useful not only for me but for other researchers too. So time well spent, I'd say. Also, if anyone does know a lot about Python programming – please forgive my hodge-podge code. If you take one thing away from this let it be this: approach your data scientists for some help with analysing verbatim comments. There are cool machine learning and predictive models out there that can help.
And yes, I know "don't ask people for verbatim comments". However, sometimes it's unavoidable.
Hope this was useful. Sorry mobile users :)
submitted by belthazubel to UXResearch [link] [comments]


2020.06.11 15:20 theduckspants Open Source Project: Userform Validations

In the spirit of continuing to post VBA source code for projects I've worked on over the years in case they are useful to anyone, today I'm sharing my reusable classes for creating userform validation on inputs. This project could easily be ported to Access, or even VB.net.
Userform Validations
If you are using Userforms to collect data from folks, validating the input data is very important; otherwise you'll spend a long time cleaning up your data to make it useful.
These classes will allow you to declaratively configure input validations for custom userforms in Excel using an array of pre-built options.

Features
Gitlab Repo
Example code, a working example spreadsheet, and other documentation can be found here:
https://gitlab.com/dc_excel/validations

Previous Posts
If you missed it, here's the previous post on being able to format charts using formatting or conditional formatting on the cells containing the chart's data: https://www.reddit.com/excel/comments/gzr87y/formatting_charts_by_formatting_source_cells/
submitted by theduckspants to excel [link] [comments]


2020.06.09 15:50 AfricanJuju Class Inheritance and Controls

I am playing on vb.net to practice some inheritance basics, and need to know the best workflow for what I want to do.
Winform project.
I have a class called "VerifiedInputBox". It has properties: Label (as Label), InputBox (As control), and PictureBox (As PictureBox). I will have many of these Verified control sets on my winform. The InputBox will either be a ComboBox or TextBox (depending on what is needed at the time).

These items form one object that can be hidden or shown as one block as needed. The idea is that I want to parse the inputs from the input box, and the PictureBox will show a green or red image based on whether the input is good or bad.
VerifiedInputBox is supposed to be a base type that I can derive VerifiedTextBox and VerifiedComboBox from. Then my Verify method will parse the TB for a double, or the CB for anything that is valid (based on other criteria).
Looks like you can't override a property to one with a different type. What is the best way to achieve my goal?
submitted by AfricanJuju to dotnet [link] [comments]


2020.06.03 01:33 c33s How to compile a php 7.4 extension for windows and which visual studio components to install?

TLDR (core questions):
  1. what are the minimum necessary components to install in visual studio community 2017 to compile a php 7.4 extension?
  2. how to compile the extension https://github.com/openvenues/php-postal for php 7.4 on windows for an old window7 pc?
  3. do i need windows7 sdk to be able to run the extension on an old windows7 computer?
  4. is it possible to compile a windows extension (dll) on a linux computer? docker? (i am really looking for an alternative where i don't have to download so big apps only to get a cli compiler.)
  5. after installing visual studio 2017 in a minimum variant i only have Developer Command Prompt for VS 2017 and no VS2015 x64 Native Tools Command Prompt as stated in https://wiki.php.net/internals/windows/stepbystepbuild_sdk_2
i ask the question here because the visual studio downloader is terribly slow (so no easy download/install/uninstall testing), because of the requirement to register everywhere (including a phone number) to access resources and downloads, the unavailability of resources for older operating systems or program versions, and the undetailed manuals i found.
long story:
trying to compile php-postal https://github.com/openvenues/php-postal as a windows php extension took me on a quite time-consuming journey with no end in sight, maybe someone can help. my plan would be to make a merge request to update the build manual with my findings.
i followed the manual from the php wiki https://wiki.php.net/internals/windows/stepbystepbuild_sdk_2
which tells me that it's not supported to compile php 7.4 with visual studio 2019.
```
Visual C++ 14.0 (Visual Studio 2015) for PHP 7.0 or PHP 7.1.
Visual C++ 15.0 (Visual Studio 2017) for PHP 7.2, PHP 7.3 or PHP 7.4.
Visual C++ 16.0 (Visual Studio 2019) for master.
```
having a closer look on the supported versions link https://wiki.php.net/internals/windows/compiler
```
Visual C++ 14.00 (2015)
Visual C++ 14.10 (2017)
Visual C++ 14.20 (2019)
```
tells me, that it should work with visual studio 2019, so i tried to install visual studio 2019 but it really has quite a lot of components. which one to install? i tried to pick the components which sound most plausible and tried to keep to an absolute minimum.
then i head over to https://github.com/Microsoft/php-sdk-binary-tools to download the php-sdk
i tried to run it with my picked components and failed, then i wanted to switch to visual studio 2017 to be as close as possible to the manual. the next problem was that microsoft requires me to create an account to download the older visual studio version. side story: then they locked my old account because i tried to log in with uMatrix and uBlock on, which i disabled part by part only to end up on an account-locked page. where is my wget https://microsoft.com/vs/vccompiler2015.zip which works even if the version is not supported any more.
the DX (developer experience) is really awful. to unlock they want my phone number. why?! then i noticed that i can install visual studio 2017 with the webinstaller from visual studio 2019, so no account needed any more. should i mention that before i was able to uninstall visual studio 2019 i had to update it? the 70mb file to download took me 4 hours (3 times the download stopped, then i waited for 2 hours and it worked. slow, but it worked. in the time it takes me to download a 70mb file from microsoft to uninstall an application, i download a 30gb game from steam and finish a few levels.)
after uninstalling vs 2019 i installed vs 2017 with the following components:

```
C# and Visual Basic Roslyn compilers
MSBuild
Static analysis tools
.NET Framework 4.6.1 SDK
.NET Framework 4.6.1 targeting pack
Text Template Transformation
Visual Studio C++ core features
Visual C++ 2017 Redistributable Update
Visual C++ core desktop features
VC++ 2017 version 15.9 v14.16 latest v141 tools
C++/CLI support
VC++ 2017 version 15.4 v14.11 toolset
```
https://github.com/Microsoft/php-sdk-binary-tools lists the following requirements:
```
Visual C++ 2017 or Visual C++ 2019 must be installed prior SDK usage.
Required components
    C++ dev
    Windows SDK
    .NET dev
```
  • what is "C++ dev"? is it "VC++ 2017 version 15.9 v14.16" or something else?
  • which .net dev is required? why do i need .net for a c php extension?
  • what about the windows sdk?
vs 2017 installer shows only the following sdks:
```
Windows 10 SDK (10.0.17763.0)
Windows 8.1 SDK
```
so windows7 support is over and microsoft decided to remove all the drivers and everything else? sadly the extension must run on an old windows7 pc, so can i compile such an extension on the old windows 7 pc itself? is it possible to compile it on my windows 10 workstation without having a windows 7 sdk?
full list of visual studio 2017 components from the 2019 web installer:
``` .NET .NET Core runtime .NET Framework 3.5 development tools .NET Framework 4 targeting pack .NET Framework 4.5 targeting pack .NET Framework 4.5.1 targeting pack .NET Framework 4.5.2 targeting pack .NET Framework 4.6 targeting pack .NET Framework 4.6.1 SDK .NET Framework 4.6.1 targeting pack .NET Framework 4.7 SDK .NET Framework 4.7 targeting pack .NET Framework 4.7.1 SDK .NET Framework 4.7.1 targeting pack .NET Framework 4.7.2 SDK .NET Framework 4.7.2 targeting pack .NET Native .NET Portable Library targeting pack Advanced ASP.NET features
Cloud, database, and server Azure Authoring Tools Azure Cloud Services build tools Azure Cloud Services core tools Azure Compute Emulator Azure Data Lake and Stream Analytics Tools Azure development prerequisites Azure libraries for .NET Azure Mobile Apps SDK Azure Resource Manager core tools Azure Storage AzCopy Azure Storage Emulator Cloud Explorer CLR data types for SQL Server Connectivity and publishing tools Container development tools Container development tools - Build Tools Data sources and service references Data sources for SQL Server support IIS Express Microsoft Azure WebJobs Tools Redgate SQL Search Service Fabric Tools SQL ADAL runtime SQL Server Command Line Utilities SQL Server Data Tools SQL Server Express 2016 LocalDB SQL Server Native Client Web Deploy
Code tools Class Designer ClickOnce Publishing Dependency Validation Developer Analytics tools DGML editor Git for Windows GitHub extension for Visual Studio Help Viewer LINQ to SQL tools NuGet package manager NuGet targets and build tasks PreEmptive Protection - Dotfuscator Static analysis tools Text Template Transformation
Compilers, build tools, and runtimes .NET Compiler Platform SDK C# and Visual Basic Roslyn compilers C++ Universal Windows Platform tools for ARM64 C++/CLI support Clang/C2 (experimental) IncrediBuild - Build Acceleration Modules for Standard Library (experimental) MSBuild Python 2 32-bit (2.7.14) Python 2 64-bit (2.7.14) Python 3 32-bit (3.6.6) Python 3 64-bit (3.6.6) Runtime for components based on Node.js v6.4.0 (x86) Runtime for components based on Node.js v7.4.0 (x86) Runtime support for R development tools VC++ 2015.3 v14.00 (v140) toolset for desktop VC++ 2017 version 15.4 v14.11 toolset VC++ 2017 version 15.5 v14.12 toolset VC++ 2017 version 15.6 v14.13 toolset VC++ 2017 version 15.7 v14.14 toolset VC++ 2017 version 15.8 v14.15 toolset VC++ 2017 version 15.9 v14.16 latest v141 tools VC++ 2017 version 15.9 v14.16 Libs for Spectre (ARM) VC++ 2017 version 15.9 v14.16 Libs for Spectre (ARM64) VC++ 2017 version 15.9 v14.16 Libs for Spectre (x86 and x64) Visual C++ 2017 Redistributable Update Visual C++ compilers and libraries for ARM Visual C++ compilers and libraries for ARM64 Visual C++ runtime for UWP Visual C++ tools for CMake Windows Universal CRT SDK Windows XP support for C++
Debugging and testing .NET profiling tools C++ profiling tools JavaScript diagnostics Just-In-Time debugger Test Adapter for Boost.Test Test Adapter for Google Test Testing tools core features WebSocket4Net
Development activities ASP.NET and web development tools C# and Visual Basic C++ Android development tools C++ iOS development tools Cookiecutter template support Embedded and IoT Development F# desktop language support F# language support F# language support for web projects JavaScript and TypeScript language support JavaScript ProjectSystem and Shared Tooling Microsoft R Client (3.3.2) Mobile development with JavaScript core features Node.js development support Node.js MSBuild support Office Developer Tools for Visual Studio Python IoT support Python language support Python web support R language support Razor Language Services Visual C++ for Linux Development Visual C++ tools for CMake and Linux Visual Studio C++ core features Visual Studio Tools for Office (VSTO) Windows Communication Foundation Windows Workflow Foundation Xamarin Xamarin Remoted Simulator Xamarin Workbooks
Emulators Google Android Emulator (API Level 23) (global install) Google Android Emulator (API Level 23) (local install) Google Android Emulator (API Level 25) Google Android Emulator (API Level 27) Intel Hardware Accelerated Execution Manager (HAXM) (global install) Intel Hardware Accelerated Execution Manager (HAXM) (local install)
Games and Graphics Cocos Graphics debugger and GPU profiler for DirectX Image and 3D model editors Unity 2018.3 64-bit Editor Unreal Engine installer Visual Studio Android support for Unreal Engine Visual Studio Tools for Unity
SDKs, libraries, and frameworks Anaconda2 32-bit (5.2.0) Anaconda2 64-bit (5.2.0) Anaconda3 32-bit (5.2.0) Anaconda3 64-bit (5.2.0) Android NDK (R11C) Android NDK (R11C) (32bit) Android NDK (R12B) Android NDK (R12B) (32bit) Android NDK (R13B) Android NDK (R13B) (32bit) Android NDK (R15C) Android NDK (R15C) (32bit) Android SDK setup (API level 19) (local install for Mobile development with JavaScript / C++) Android SDK setup (API level 21) (local install for Mobile development with JavaScript / C++) Android SDK setup (API level 22) (local install for Mobile development with JavaScript / C++) Android SDK setup (API level 23) (global install) Android SDK setup (API level 23) (local install for Mobile development with JavaScript / C++) Android SDK setup (API level 25) Android SDK setup (API level 25) (local install for Mobile development with JavaScript / C++) Android SDK setup (API level 27) Apache Ant (1.9.3) Blend for Visual Studio SDK for .NET Cordova 6.3.1 toolset Entity Framework 6 tools Graphics Tools Windows 8.1 SDK Microsoft distribution OpenJDK Modeling SDK TypeScript 2.0 SDK TypeScript 2.1 SDK TypeScript 2.2 SDK TypeScript 2.3 SDK TypeScript 2.5 SDK TypeScript 2.6 SDK TypeScript 2.7 SDK TypeScript 2.8 SDK TypeScript 2.9 SDK TypeScript 3.0 SDK TypeScript 3.1 SDK USB Device Connectivity Visual C++ ATL (x86/x64) with Spectre Mitigations Visual C++ ATL for ARM Visual C++ ATL for ARM with Spectre Mitigations Visual C++ ATL for ARM64 Visual C++ ATL for ARM64 with Spectre Mitigations Visual C++ ATL for x86 and x64 Visual C++ MFC for ARM Visual C++ MFC for ARM with Spectre Mitigations Visual C++ MFC for ARM64 Visual C++ MFC for x86 and x64 Visual C++ MFC for x86/x64 with Spectre Mitigations Visual C++ MFC support for ARM64 with Spectre Mitigations Visual Studio SDK Windows 10 SDK (10.0.10240.0) Windows 10 SDK (10.0.10586.0) Windows 10 SDK (10.0.14393.0) Windows 10 SDK (10.0.15063.0) for Desktop C++ [x86 and x64] Windows 10 SDK (10.0.15063.0) for UWP: C#, VB, JS Windows 10 SDK (10.0.15063.0) for UWP: C++ Windows 10 SDK (10.0.16299.0) for Desktop C++ [ARM and ARM64] Windows 10 SDK (10.0.16299.0) for Desktop C++ [x86 and x64] Windows 10 SDK (10.0.16299.0) for UWP: C#, VB, JS Windows 10 SDK (10.0.16299.0) for UWP: C++ Windows 10 SDK (10.0.17134.0) Windows 10 SDK (10.0.17763.0) Windows 8.1 SDK Windows Universal C Runtime ```
submitted by c33s to PHPhelp [link] [comments]


2020.05.27 20:43 timmmy8 Grand Theft Auto V: A Look Back at the Major Leaks

Here we go again. Buckle in. This post is a biggie, this time covering Rockstar’s latest entry in the series about grand theft…and auto, "Grand Theft Auto V". There were so many questions about what possible direction this franchise could go, with many rumours seemingly just spouting nonsense and seeing what gained traction, but there were some common themes amongst the leaks. Whether this meant that this was all genuine information, or whether they just all started copying each other, no one will know – although I believe in the latter.
Of course, naturally, with this being Rockstar’s biggest franchise and people desperate for just about any piece of information they could get, legitimate or otherwise, there were hundreds of rumours and leaks for this game, and I will do my best to sift through the endless supply of such and talk about the ones worth mentioning.
Let’s jump in, and here is a spoiler warning just in case.

May 4, 2010 – E3 Leak Reveals “Grand Theft Auto: Vice City 2”

Way back in 2010, Game Reactor shared the alleged lineup for 2010’s E3 event, detailing many games that are going to be revealed, including “Grand Theft Auto: Vice City 2”. Other interesting mentions in this “leak” include a new “Half-Life”, a sequel to “Bully”, and a premature announcement of “Kingdom Hearts 3”.
Outcome? Fake. Never trust E3 leaks.

26 July, 2010 – Is "GTAV" Heading to Hollywood?

VG247 shares with the world the first real hint that the game is headed back to America’s sun-spoilt West Coast. While the article linked does have the tease in a conversation format, it does make mention of Hollywood and that an announcement could be coming soon. Separately, it seems Eurogamer reached out to their own sources and were able to confirm that while Rockstar had been scouting out the Hollywood area, they were unable to confirm for what actual franchise it was for. This wouldn’t be the first time Rockstar has taken "GTA" to Los Angeles, with "San Andreas" already staking that claim.
Outcome? Confirmed.
As we all know, "GTAV" was set in Los Angeles, and this was our first clue to such.

February 28, 2011 – Rockstar Registers Web Domains

Courtesy of XboxAchievements, due to the original source being taken down, readers are able to treat themselves to a handful of domain names that Rockstar had publicly registered. While on the surface these do not appear to have any mention on the game, as correctly speculated in the article, these turned out to be related to in-game websites and businesses. The one’s registered were;
CashForDeadDreams.com - buy second-hand items from the elderly
SixFigureTemps.com - a job site to make money fast
HammersteinFaust.com - an employment firm business in the game
LifeInvader.com -the game's social networking service
The only one registered that doesn’t seem to make an appearance in the game is StopPayingYourMortgage.net, although typing this into an actual browser will take you to Rockstar’s "GTAV" site.
Outcome? Confirmed. As we can see, majority of these actually end up in the game, and I am sure I didn’t even have to provide proof for LifeInvader.

March 8, 2011 – Casting Call Leaked, Rockstar’s Next Game Codenamed “Rush”

GameWatcher reports that Rockstar have put out a casting call for an “interactive project”, which has been code-named “Rush”. The call seeks performers for the following roles;
Mitch Hayes – 38 yrs old – Annoying, wise cracking, highly successful FBI agent. In great shape. Does triathlons, drinks low cal beer, but still has a sense of humor.
Miguel Gonzalez – 25 yrs old – Young Mexican American FBI agent, caught between a few mob bosses. Very clean cut
Clyde – 23 yrs old – Moronic, almost inbred and creepy white trash hillbilly. Very naïve but in a creepy ‘it’s only incest sort of way’
Brother Adam – 50 yrs old – Welsh monk, cult leader, yoga teacher, very lithe, very into exploring your personal tension through gripping massage. Needs Welsh accent.
Mrs Avery – 48 yrs old – Neurotic soccer mom, home maker, anxious and addled on pain killers. Very angry at neighbor MRS Bell.
Mrs Bell – 45 yrs old – Swinger, and mellow Californian divorcee. Ugly but comfortable with self.
Eddie – 47 yrs old – Weed evangelist, guy who started smoking at 30, and is now a leading proponent of marijuana’s fantastic properties. White, awkward.
Ira Bernstein – 56 yrs old – publicist for an actress known as America’s newest sweetheart who just so happens to love animals, orphans, drugs and sex. He’s always trying to hide her latest indiscretion.
Kevin De Silva – 18 yrs old – Albert’s fat, FPS playing gamer son. Smokes a lot of weed, has anxiety issues and a card for a bad back, very soft, very opinionated. Into making racist comments while playing online.
Harut Vartanyan – 42-52 yrs old – Armenian car dealer, moneylender, would be Fagin and would be bully. Heavily connected to the underworld, but irritates people so much no one likes him.
Nervous Jerry – 48 yrs old – paranoiac living in the sticks, near Simon, completely paranoid, and terrified of Simon.
Calvin North – 55 yrs old – clapped out FBI agent who now mostly works offering advice on TV shows – whose only claim to fame turns out to be entirely false – but a decent guy in other ways. Badly dressed. Divorced. Putting on weight.
Jerry Cole – 53 yrs old – disabled IT expert and criminal information vendor.
Rich Roberts – 35 yrs old – English hardman actor, who acts tough but who wants to do serious work – the only problem is he can’t quite read the words.
Alex – 52 yrs old – white, loosie goosie hippy rich guy who has lost his money and is getting desperate but trying not to.
Scarlet – 45-52 yrs old – unshaven female spiritualist and hippy with a love of exploring the wilderness. Very into journeys.
Chad – 29 yrs old – pretty boy misogynist Beverly Hills party boy. Made money, but not as cool as he thinks he is.
Tae Wong – 39 yrs old – somewhat incompetent Chinese mobster, loves doing ecstasy, going to raves.
Taes Translator – 45 yrs old – VERY STRAIGHT LACED Chinese translator, terrified of his boss’s dad. Male, awkward. Needs to speak Chinese.
A big thank you to GTA Fandom for being able to compare the casting call with the game’s final release. Here is a table for those who want to see it;
Comparison between the casting call and the in-game characters
What else is interesting is that Trevor Phillips, one of the game’s main protagonists, was referred to as Simon here, while Albert De Silva in-game is instead Michael De Santa.
Outcome? Confirmed. Although the names have changed, you can definitely see the resemblance of many of the characters in the game, and that the majority did appear albeit under a different name.

March 29, 2011 – Stuntman “Typo” Places "GTAV" on his Resume

Declan Mulvey just might have made a typo when he placed "Grand Theft Auto V" on his resume, saying he did stunt work for the game. However, once eagle-eyed internet sleuths noticed this, according to Eurogamer he told CVG (whose article I cannot find) that it was simply a typo and that he meant to write “Grand Theft Auto IV”. What is interesting is that this was never corrected, and that he was never in the credits for that game - I think he simply made an oopsie.
Outcome? Confirmed. Definitely not a typo, as he is credited in "GTAV" and is not credited in "GTAIV".

June 5, 2011 - Play as a Cop in "GTAV"

As shared on GTAForums, one very controversial rumour regarding the game is that it would feature a story where you play as a rookie cop, working your way up in the ranks to either being the biggest detective in the city, or a cop-gone-rogue. I won’t post the whole “leak” word-for-word, but it sounds like it would have played a lot like L.A. Noire, Rockstar’s detective game also set in Los Angeles.
Based on the rumour, you would start the game as a rookie cop fresh out of the academy, named “Rock or Brock”, and as the story moves along, the player would find himself challenged by his partner who is working in the criminal underworld. The further you progress, the more you would find out that your partner is dirty, but you have a choice to either work with him illegally or to investigate him - with the endgame resulting in either the player becoming the captain of the police force, or a “dirty cop Kingpin”. Post game, you would continue performing these roles, either abiding by and enforcing the law as the captain, or selling contraband and performing other illegal tasks as a kingpin.
Gameplay wise, you would have to respond to dispatches over the radio, perform traffic stops, aid civilians, participate in car chases and even menial traffic tasks such as fining those with faulty brake lights, or speeding. The further you progress, the more involved you get with drug dealers, pimps, and organised crime. If a player wants to stray to the dark side, they can plant and steal evidence, beat informants, and sell drugs and guns.
Other minor details include the return of some characters from San Andreas such as “CJ”, being able to go through SWAT training, helicopter training, performing traffic duty, being able to carry a baton, mace, and taser, and being able to handcuff characters through rotating the analogue sticks.
Obviously, you will learn more details about this leak by opening the above forum post, but at the time this was not well received by all, some questioning why a game made famous for allowing players to commit the crimes they want, would now have them play on the other side of the law.
Outcome? Fake. Nothing turned out to be true, although it did “guess” that the weapon wheel from Red Dead: Redemption would return.

June 20, 2011 – 2012 Release “Pretty Likely” for "GTAV"

Sources close to Rockstar Games have confirmed with GameSpot that development is “well under way”, and that a 2012 release is looking pretty likely. Additionally, Gamespot reports that the final touches are being worked on now, such as minigames, and that the scale of the game is vast, saying “It’s the big one”.
Outcome? Plausible. As we know, the game launched in 2013, and it was a “big one”.

October 25, 2011 – Rockstar Announce "Grand Theft Auto V"

Rockstar Games announce "Grand Theft Auto V" on Twitter.

October 25, 2011 – Kotaku Confirm LA setting, Possible Multiple Playable Characters

On the same day that the game is announced, Kotaku is able to confirm that "GTAV" will be set in Los Angeles, according to their source who is familiar with the game. As well as talking about the setting, they have other sources saying that the game will feature multiple playable characters - something that was somewhat touched on with "GTAIV"'s expansions. Not much else to report here.
Outcome? Confirmed. The game, as we all know, features three playable characters and is set in Rockstar’s version of LA.

November 2, 2011 – "Grand Theft Auto V" Trailer drops

Get nostalgic here!

November 4, 2011 – Los Santos Map Leaked by Employee?

Thanks to iGTA5 we can see that an employee apparently shared a version of the game’s map on Twitter, which is viewable right here. It does show Los Santos and Vinewood, but we know that this map isn't an accurate representation of the released version - it could be an early version, but I doubt it. Shortly after posting, the account named “toronotoJack233” got deleted.
Outcome? Fake.

November 5, 2011 – UK Magazine Leaks "GTAV" Information

As reported on VG247, it seems that an employee of a Playstation-focused magazine has leaked some information about the recently announced "GTAV". There is quite a lot of information, but it basically comes down to;
It is possible that this is a legitimate leak; some of the points made are representative of aspects of the final game. However, just looking at it we can see that stuff like rock-climbing, canoeing, and abseiling did not feature, and neither did earthquake tremors. There was also no need to focus on refuelling vehicles, nor was there the ability to use human shields in combat.
Outcome? Plausible. It is possible that some of this information genuinely came about as a leak, although I don’t feel confident enough in the content to say a verified leak – especially as this came after the trailer.

November 8, 2011 – More “Leaks” at GTAForums

Another big pile of leaks, this time from a user called OpenSuvivor, and now on the GTAForums. However, the post had been deleted pretty swiftly by the forum moderators, so we will be using this reddit post as our source of information.
Another big list of stuff that I encourage you to read, even if just for old time’s sake. Some interesting notes though mention;
As we know, 99% of this list is just completely inaccurate, making it easy to determine the validity of this “leak”. Having said that, it’s still fun to read, and of course at the time there is just no way of knowing if it really is fake.
Outcome? Fake. I don't think it is coincidence that these recent fake leaks all came out after the trailer.

March 28, 2012 – Former Rockstar Employee Reveals Information

Another “insider” leak, this time coming from “a friend of someone who recently got sacked from Rockstar North for general misconduct”. While the original document is no longer viewable, it is possible to find out what was written thanks to Playstation Lifestyle. Straight away, looking back, we can make a pretty quick judgement about the validity of this “leak”.
Firstly, they mention that the protagonist “will be one character, and one character alone”. This character’s name is Albert De Silva, and he has a kid Kevin who is pretty much your typical gamer who smokes weed. Our protagonist is the man that we saw in the first trailer, and he will not die at the end of this game.
Next we know that multiplayer lobbies can hold 32 players on Xbox 360 and PS3, and that you will be able to form gangs that level up with reputation, rather than XP. There is an underworld that has a working economy, and the players can take drugs – which will have side effects.
We learn (again) that the map is five times as large as the "GTAIV" map, and that planes are now flyable in this game, unlike the previous "GTA" entry. Guns and cars will also be customisable in this game, allowing suppressors on weapons, and nitrous on cars. Gunfights are now meant to be more realistic, and shooting will be more difficult out of cars due to shaking cameras.
The most interesting piece of information is the mention of the game aiming to be released in May 2013. A release date hadn’t been formally announced at this point by Rockstar, but an end of 2012 release I believe was the consensus at the time.
Outcome? Fake. Some of these points are a mix of whole truths and complete inaccuracies, while some also blur the line. For example, they got the main character part correct (as we know that Albert was Michael’s name in development), but they were incorrect in stating there would be one character and that he will not die (for as we know it is possible he can). Multiplayer lobbies only held 16, and there was no underworld economy.

April 10, 2012 – Rockstar Employee’s CV Leaks October 2012 as Release Date

Character animation developer for Rockstar, Alex O’Dwyer, mentions on his CV that he had worked on "Grand Theft Auto V", and that it is expected to release during October 2012, as viewable here. Since this was spotted, it has since been removed, with no comment from O’Dwyer or Rockstar. What is interesting, is that if this is also a legitimate leak, it would have also confirmed a PC version of the game.
Outcome? Confirmed. While the launch date did end up being incorrect, given the mention of the PC version and the fact that he did work on the game, I believe it is possible that October 2012 was a launch goal internally at Rockstar – at least at one point.

May 15, 2012 – "GTAV" Vehicle List Found in "Max Payne 3"

A user on GTAForums had allegedly found the vehicle list for "Grand Theft Auto V" within the game files of "Max Payne 3", outlining the types of trains, cars, boats, helicopters, and bikes that players would be able to use. Vehicles of note included a cable car, a chair lift, a ski-mobile, and an APC. If memory serves me correctly, these are some of the vehicles mentioned that did not make it into the final game, while on the other hand there are dozens of vehicles that are not mentioned here that are in the game.
Outcome? Fake.

September 9, 2012 – Our First Gameplay Leak?

A popular video that I personally remember doing the rounds before the game’s release was this video here, which shows a car being driven through the incredibly detailed desert, before getting into a helicopter. The map shows an incredibly detailed mountain range, and a desert populated with scenery. It definitely seems like something that would be in the style of the "GTA" series, and one I somewhat believed when I first saw it.
Outcome? Fake. As it turns out, this was just a fan-made video, that has gone over multiple name changes throughout the years – I believe a comment mentioned that this title had once been renamed to suggest it was a "GTAVI" leak.

October 28, 2012 – Polish Site Leaks “GTAV” Promotion Material and Release Date

The promotional material and release date had seemingly been made public – not through Rockstar but a Polish site (a site from Poland…not that stuff you shine stuff with). The posters include Franklin and his dog (at the time I believe he hadn’t been revealed), as well as characters preparing to rob the jewellery store (revealed days earlier, per Kotaku). The tagline on the posters suggests that the game is available in Spring 2013 – (March, April, May for us southern hemisphere-ers), which at the time seemed likely as it was pretty clear the game was not coming out in 2012. It is also interesting that “Red Dead: Redemption” was also released in Spring of 2010 – so it isn’t entirely unfeasible to suggest the same for “GTAV”.
Outcome? Confirmed. I believe this is a genuine leak, the artwork is too legitimate, as well as the release date being able to be confirmed by the following…

October 30, 2012 – Rockstar Announce Spring 2013 Release Date

Rockstar Games are “proud to announce that Grand Theft Auto V is expected to launch worldwide spring 2013 for Xbox 360 and Playstation 3”.

January 31, 2013 – Rockstar Announce September 17, 2013 Release Date

Well that was quick.

August 23, 2013 – “GTAV” files discovered on Playstation Store

As Venture Beat reports, users who had preordered “GTAV” on the European version of the PlayStation Store were able to download some files, as they had become available on the 22nd of August. Users mined these details, being able to discover the game's soundtrack (here is a reddit thread about it), while the game’s main theme was leaked as well - the video of such has since been deleted. Due to the nature of the files, the leaks were primarily audio, and did not lead to any actual gameplay leaks.
A day later, Playstation provided a comment on the matter, saying;
Regrettably, some people who downloaded the digital pre-order of Grand Theft Auto V through the PlayStation Store in Europe were able to access certain GTA V assets. These assets were posted online. We have since removed the digital pre-order file from the PlayStation Store in Europe. We sincerely apologize to Rockstar and GTA fans across the world who were exposed to the spoiler content. GTA V is one of the most highly anticipated games of the year with a very passionate following, and we’re looking forward to a historic launch on September 17.
Outcome? Confirmed.

September 10, 2013 – Strategy Guide Leaks Map

The world was able to discover just how big the game was when the map was taken from the game’s official strategy guide. The map was posted onto reddit for all to see right here, and many were impressed – one user likening the map to a “Teenage Mutant Ninja Turtle”. The guide was originally meant to release alongside the game on the 17th of September, but it seems once retailers received their copies they were able to scan and share some of the information.
Outcome? Confirmed.

September 17, 2013 - Grand Theft Auto V Released

September 22, 2013 – Micro-transactions Discovered

Reddit user u/1880 seemed to discover why it was so hard for players to earn cash in the game, finding a file that references “cash cards” ranging from $100,000 to $1,250,000. A copy of the file was uploaded here. However, these “cash cards” could not actually be accessed, despite the game already being playable. The original post is quite hopeful that it was just something scrapped in development, as it “seems very un-Rockstar-y” to the poster.
Unfortunately, reddit user u/Nouveau_Compte was able to provide proof, shared on the same thread, that it was only for the online mode.
Outcome? A sad confirmation.

Closing Thoughts & Some Housekeeping

That was a big read, featuring a good mix of confirmed and fake leaks - hopefully giving you the hint to stay vigilant as rumours start to increase in frequency for Grand Theft Auto VI. Having said that, you never know which of the things you read will turn out to be true. Here are a couple of other links that you might also find interesting, but that I didn't include in the post;
GTA V Location Teased in GTAIV Manual
New Casting Call for GTAV
Rockstar Games seem to attract a high number of rumours, leaks, and just plain lies, given their reputation and the popularity of their games. I can also tell you that the next GTA game I am working on a post for already has more sources and "leaks" than I found for GTAV...but that post won't be ready any time soon.
Some other housekeeping notes - I just want to find out a few things;
On other things: these posts might start slowing down - but fear not! They are still being worked on, and I've got a few more in the pipeline. I am also going to limit these posts to games that have already been released. For those who are asking, the idea of a video format is also being explored.
Here are some previous editions of this series;
Thanks for reading, appreciate any and all feedback.
Cheers!
submitted by timmmy8 to Games [link] [comments]


2020.05.21 17:41 Grimson89 Problem Deploying Visual Studio

Hello All, I am looking for some help or guidance deploying Visual Studio 2010. I was successful in creating the application and deploying it with SCCM using the install program ".\Setup\setup.exe /q /UnattendFile ..\VS2010_deployment.ini". However, the installation fails with a 1603 error code. From the logs, it looks like it is failing to install two dependencies: TFS Object Model (x64) and .NET Framework 4 Multi-Targeting Pack. If I install the application with the GUI it works, but when I try to silently install it I get these errors. Below, in the spoiler, is where I think the problem is occurring in the install log; the link below that is a folder with all the setup logs I could find. Wondering if anyone has seen this before and would have advice on how to remedy the situation.
https://drive.google.com/drive/folders/1j8SI9jSNT8PIvWoHLWTYjhXyA40ZsjXA?usp=sharing

[05/16/20,01:12:02] Microsoft Visual Studio 2010 Professional - ENU: Initializing command line for MsiInstallProduct: VSEXTUI="1" SETUPWINDOW="0" PIDKEY="W2W2H2VP9BMHVH93YTGR7MTHJ" INSTALLLEVEL="2" REBOOT="ReallySuppress" ProductID="01018-569-0492355-70034" ALLUSERS="1" REINSTALLMODE="" ADDLOCAL="Visual_Studio.NET_Pro_x86_enu,VB_for_VS_7_Pro_11320_x86_enu,VB_PowerPacks_for_VS_x86_enu,InstallShield.NewProject.Templates,VCpp_for_VS_7_Pro_x86_enu,FT_VC_Libraries_Core_X86,FT_VC_Libraries_Extended_X86,FT_VC_Libraries_Core,FT_VC_Libraries_Extended,x64_Compilers_and_Tools_5909_x86_enu,FT_VC_Libraries_Core_X64,FT_VC_Libraries_Extended_X64,VCsh_for_VS_7_Pro_810_x86_enu,VS_FSharpBase,BSPkg_FSharpRedist,BSPkg_FSharpRedist2.0,VWD_for_VS_Pro_11324_x86_enu,VS_Office_Dev_Tools_11031_x86_enu,Testing_Tools_for_Pro_x86_enu,Visual_Studio_Graphics_Library_x86_enu,TeamExplorer_enu,FT_VC_VARS_X86_VS,FT_VC_VARS_X86,VS_Remote_Debugging_722_x86_enu,VSA_ENV_x86_enu,WinSDK_All,WinSDK_VSTools.4992,WinSDK_VSWin32Tools.4902,WinSDK_VSHeadersLibs.4897,WinSDK_VS_Headers.4898,WinSDK_VS_Libs_x86.4899,WinSDK_VS_Libs_x64.4900,WinSDK_VS_Libs_ia64.4901,WinSDKIntellisenseRefAssys.4894,WinSDKIntellisense.4895,WinSDKRefAssys.4896,WinSDK_NFXToolsM_DDF,WinSDK_Nfx35ToolsM_DDF,WinSDK_CommonRegistry,WinSDK_PackageRegistry,VS_PRO_enu_x86_net_SETUP,ARP_REG_KEYS_VS_PRO_ENU_X86_x86_enu,Sql_Eulas_For_VSBox_enu,PID_Validation,Servicing_Key,Detection_Keys" VS7.3643236F_FC70_11D3_A536_0090278A1BB8="C:\Program Files (x86)\Microsoft Visual Studio 10.0\" TARGETDIR="C:\" ARPINSTALLLOCATION="C:\Program Files (x86)\Microsoft Visual Studio 10.0\"
[05/16/20,01:12:02] Microsoft Visual Studio 2010 Professional - ENU: CRootComponent::Install(): Beginning Brooklyn Component Installation
[05/16/20,01:12:02] Setup.exe: ISetupManager::GetFullComponents()
[05/16/20,01:12:02] Setup.exe: ISetupManager::GetFullComponents()
[05/16/20,01:12:02] Setup.exe: ISetupManager::GetFullComponents()
[05/16/20,01:12:02] Setup.exe: ISetupManager::GetFullComponents()
[05/16/20,01:12:02] Setup.exe: ISetupManager::GetFullComponents()
[05/16/20,01:12:02] Setup.exe: ISetupManager::GetFullComponents()
[05/16/20,01:12:02] Setup.exe: ISetupManager::GetFullComponents()
[05/16/20,01:12:02] Setup.exe: ISetupManager::GetFullComponents()
[05/16/20,01:12:02] Setup.exe: ISetupManager::GetFullComponents()
[05/16/20,01:12:02] Setup.exe: ISetupManager::GetFullComponents()
[05/16/20,01:12:02] Setup.exe: ISetupManager::GetFullComponents()
[05/16/20,01:12:02] Setup.exe: ISetupManager::GetFullComponents()
[05/16/20,01:12:02] Setup.exe: ISetupManager::GetFullComponents()
[05/16/20,01:12:02] Setup.exe: ISetupManager::GetFullComponents()
[05/16/20,01:12:02] Setup.exe: ISetupManager::GetFullComponents()
[05/16/20,01:12:02] Setup.exe: ISetupManager::GetFullComponents()
[05/16/20,01:12:02] Setup.exe: ISetupManager::GetFullComponents()
[05/16/20,01:12:02] Setup.exe: ISetupManager::GetFullComponents()
[05/16/20,01:12:02] Setup.exe: ISetupManager::GetFullComponents()
[05/16/20,01:12:02] DepCheck: gencomp106,{12CDA52C-7A8F-4785-8A22-53C87393FEE0}
[05/16/20,01:12:02] DepCheck: gencomp48,{12CDA52C-7A8F-4785-8A22-53C87393FEE0}
[05/16/20,01:12:02] DepCheck_Result: 1
[05/16/20,01:12:02] Microsoft Visual Studio 2010 Professional - ENU: Install(): bAttemptInstall: 0
[05/16/20,01:12:02] Microsoft Visual Studio 2010 Professional - ENU: Install(): Not attempting to call MsiInstallProduct()!!!! Baseline not met!
[05/16/20,01:12:02] Setup.exe: AddGlobalCustomProperty
[05/16/20,01:12:02] Microsoft Visual Studio 2010 Professional - ENU: CRootComponent::Install(): Setup Failed; MSIInstallProduct return value either ERROR_INSTALL_FAILURE or default.
[05/16/20,01:12:02] Microsoft Visual Studio 2010 Professional - ENU: validation recorded.
[05/16/20,01:12:02] Setup.exe: AddGlobalCustomProperty
[05/16/20,01:12:02] Microsoft Visual Studio 2010 Professional - ENU: CRootComponent::Install(): Calling LaunchWatson()...
[05/16/20,01:12:02] Microsoft Visual Studio 2010 Professional - ENU: CRootComponent::LaunchWatson() - Obtained CSetupWatson instance
[05/16/20,01:12:02] Setup.exe: GetGlobalCustomProperty - Property: {AA62DF98-3F2C-11D3-887B-00C04F8ECDD6} - PropertyName: Maintenance Mode - Value: 0
[05/16/20,01:12:02] Microsoft Visual Studio 2010 Professional - ENU: CRootComponent::LaunchWatson() - Launching VS Watson
[05/16/20,01:12:02] Microsoft Visual Studio 2010 Professional - ENU: CRootComponent::LaunchWatson() - Completed VS Watson launching to create manifest: C:\Windows\TEMP\vs_setup.dll.txt
[05/16/20,01:12:02] Setup.exe: AddGlobalCustomProperty
[05/16/20,01:12:02] Microsoft Visual Studio 2010 Professional - ENU: CRootComponent::LaunchWatson() - Setting the property CustomCoreProp_WatsonManifestReady to use the VS Client Manifest.
[05/16/20,01:12:02] Microsoft Visual Studio 2010 Professional - ENU: CRootComponent::Install(): Finished calling LaunchWatson()
[05/16/20,01:12:02] Microsoft Visual Studio 2010 Professional - ENU: CRootComponent::Install(): Finished Brooklyn Component Installation
[05/16/20,01:12:02] Setup.exe: AddGlobalCustomProperty
[05/16/20,01:12:02] setup.exe: ***ERRORLOG EVENT*** : ISetupComponent::Pre/Post/Install() failed in ISetupManager::InternalInstallManager() with HRESULT -2147023293.
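For what it's worth, the "Baseline not met!" line means setup.exe's dependency check failed before MsiInstallProduct was ever attempted, which lines up with the two missing prerequisites mentioned above. A minimal sketch of one possible workaround is to chain the prerequisite installers ahead of the main silent setup; the media path, installer file names, and silent switches below are assumptions for illustration only and need to be verified against the actual VS 2010 media, so treat this as a sketch rather than a known-good recipe.

```csharp
// Minimal sketch (not verified): install the assumed missing prerequisites first,
// then run the same silent Visual Studio 2010 setup the SCCM program calls.
// All paths, file names and switches below are placeholders to adapt.
using System;
using System.Diagnostics;

class ChainedVs2010Install
{
    // Runs a command, waits for it to finish, and returns the exit code
    // (0 = success, 3010 = success but a reboot is required for most installers).
    static int Run(string exe, string args)
    {
        var psi = new ProcessStartInfo(exe, args) { UseShellExecute = false };
        using (var p = Process.Start(psi))
        {
            p.WaitForExit();
            Console.WriteLine($"{exe} {args} -> exit code {p.ExitCode}");
            return p.ExitCode;
        }
    }

    static void Main()
    {
        // Hypothetical content location for the VS 2010 media.
        const string media = @"\\server\share\VS2010";

        // Hypothetical prerequisite installers; check your own media for the real
        // file names and each installer's documented silent switches.
        Run(media + @"\WCU\NetFX\netfx_multitargetpack.exe", "/q /norestart");
        Run("msiexec.exe", "/i \"" + media + "\\WCU\\TFSObjectModel\\TFSObjectModel-x64.msi\" /qn /norestart");

        // Main silent install: the same command line used in the SCCM program.
        int rc = Run(media + @"\Setup\setup.exe", @"/q /UnattendFile ..\VS2010_deployment.ini");
        Environment.Exit(rc);
    }
}
```

An alternative that avoids a wrapper entirely would be to model the two prerequisites as their own SCCM applications and make the Visual Studio 2010 application depend on them.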
submitted by Grimson89 to SCCM [link] [comments]


2020.04.26 06:04 tonefart You can't be older than early 30s of age.

Greetings from Accion Labs!
Currently we are hiring .Net Developer
Please Find the Below JD for your perusal,
Job Title : .Net Developer
Job Location : Kota Damansara , Petaling Jaya, Malaysia
Job Type : Permanent Position
Job Description
Requirements:
· Possess a Higher Diploma/Degree in Software Engineering or equivalent
· Strong technical skill sets - MS SQL Server, MySQL, XML, VB.NET, ASP.NET, C#, .NET Framework
· Minimum 4 years solid hands-on experience in .Net software development projects
· Must be Malaysian with early 30 of age
· Career minded, willing to take up challenges and work on flexible hours
· Strong analytical mind and fast learner on new technologies
· Hands-on in architectural design and program coding base on technical specification
· Knowledge on CMMi process framework and ISO 9001 is an advantage
· Team player with right working attitude
For further official discussion Kindly drop a mail with updated CV and below details ASAP,
Full Name :
Passport Validity :
Total Years of Experience: Relevant Years of Experience:
Current Company: Current Salary: Expecting Salary: Notice Period : Current Location: Reason for Job change:
Appreciate your response & please do recommend your friends with relevant experience
submitted by tonefart to recruitinghell [link] [comments]


2020.03.29 07:59 acunningham Enswitch 4.1

Integrics are pleased to announce version 4.1 of Enswitch, the most feature-rich, most powerful, and most flexible solution for commercial telephony services such as multi-tenant hosted PBX, residential ITSP, toll-free, and number translation services.
Enswitch provides full-featured telephony, billing, invoicing, a full and highly configurable web interface for the system owner, resellers, and customers, and a comprehensive API - all in a single highly scalable and highly integrated solution. It's in production today with carriers worldwide on systems from hundreds of users on single machines to over 200,000 users on large redundant clusters. It's also increasingly used as a highly scalable PBX for those who need a system distributed across many locations, or who need to bill departments or users.
More information, including a full list of features and a working demo of the web interface, is at https://enswitch.com/
NEW FEATURES IN ENSWITCH 4.1
Telephony
- Speech recognition for IVR menus and dial by name.
- Calls that have already been answered can be picked up.
- Queues can play the expected wait time.
- Hunt groups can have separate destinations for all busy or all unregistered.
- Ordinary users can change their callerid if allowed.
- Individual telephone lines can have DTMF transfers enabled or disabled.
- Handsets can monitor multiple mailboxes for MWI.
- Call recordings can be paused and resumed via DTMF and SIP INFO.
- Confirmation emails of outbound faxes can be sent, showing the status of the sent fax.
- Time groups can be shared with sub-customers.
- Rate plan routing exceptions can route on callerid.
- Individual peer costs can allow or disallow least cost routing.
- Provisioning returns an HTTP 404 error if the MAC address is not found.
- Provisioning can have a password in the URL.
- Extra debug logging for ask web URL, MWI, busy lamps, and text messages.
Web interface and administration
- The add user wizard can be used without creating a person.
- Users' passwords are always encrypted.
- Passwords must be seven characters or longer by default.
- Users are locked out after six incorrect passwords by default.
- Usernames can be forbidden in users' and telephone lines' passwords.
- Passwords can be validated against an external validation service.
- Credit card details are hidden on the web after saving them.
- Non-error messages on the web interface are now pop-ups that disappear after a short time.
- The control panel is faster and places less load on the database server when used by many users.
- Call costs can be hidden if billing is not configured.
- Inbound groups can be searched for specific numbers.
- The call history can show either calls or call legs.
- Ordinary users can see all the legs of a call in the call history if to or from a telephone line they own.
- The call history can be filtered on non-internal calls.
- Charges can be filtered on a variety of fields.
- Individual DNS domains can allow or disallow signup.
- Audit log data can be exported.
Billing and payments
Credit card verification numbers are never stored. Credit card details are removed from customers' accounts when changing their billing type to none. Credit card details are not included by default in customer exports. Updates to customers' credit cards can be pulled from Authorize.net, ensuring their credit cards never expire.
APIs
The JSON API can accept token authentication, speeding it up and reducing load on the database server.
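As a rough sketch of what token authentication against a JSON API like this can look like from a client's point of view, here is a minimal example; the base URL, endpoint path, and header used here are hypothetical placeholders, not the documented Enswitch API.

```csharp
// Minimal sketch of calling a JSON API with a pre-issued token instead of sending
// credentials on every request. Base URL, endpoint and header are placeholders.
using System;
using System.Net.Http;
using System.Threading.Tasks;

class TokenAuthClient
{
    static async Task Main()
    {
        using var client = new HttpClient { BaseAddress = new Uri("https://pbx.example.com/") };

        // Assumed: a token was issued earlier (for example by a login call) and is
        // supplied out-of-band; the server then validates the token itself.
        string token = Environment.GetEnvironmentVariable("API_TOKEN") ?? "demo-token";
        client.DefaultRequestHeaders.Add("Authorization", "Bearer " + token);

        // Hypothetical endpoint name purely for illustration.
        HttpResponseMessage response = await client.GetAsync("api/json/telephone_lines/list");
        response.EnsureSuccessStatusCode();
        Console.WriteLine(await response.Content.ReadAsStringAsync());
    }
}
```

Skipping a full credential check on every request is presumably where the claimed speed-up and reduced database load come from.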
System management
Individual Asterisk machines can be set to audio, video, or both. Logging can be flexibly configured to log to arbitrary files, or syslogd, in any desired format.
submitted by acunningham to enswitch [link] [comments]


2020.03.24 19:26 jamesthegill Brits n' Pieces - The Pipettes

Who are The Pipettes?

They are The Pipettes. A Brighton-based pop-manifesto 60’s-throwback band with a singer turnover that makes the Sugababes look like a job for life. Founded in the Basketmakers pub in Brighton, which is yet to gain a blue plaque despite my ongoing efforts, the band flared brightly and briefly but at the right time to capture the heart of indieheads who didn’t want to admit they liked pop.

Version 0.8

I’m gonna begin this story of a proclaimed feminist girl group by talking about a man. Monster Bobby, popular DJ in indie clubs in Brighton, noticed how crazy people would go for 60s girl groups during club nights (think The Ronettes, Shangri-Las - anything that started with a kickdrum sound, basically) and spotting a gap in the indie/pop landscape, set about manufacturing one. A few pints in the pub and the band came together. Playing live, the girls would be backed by an all-male group called The Cassettes, who were only mentioned in throwaway lines in articles - it was all about the three polka-dotted singers. Hidden away on the band’s website was their manifesto, proclaiming that pop shouldn’t be a dirty word in music, and has just as valid a voice as punk.
The beta-version of the group made some small waves on the indie scene, touring with The Go! Team and picking up praise in the Brighton area. Apart from a Christmas song that I cannot find on Youtube anywhere, the band’s only release with the original line-up was the bisexual teenage-crushing anthem, I Like A Boy In Uniform (School Uniform). Fun, frivolous and not as innocent as first appeared when you pay attention to the lyrics, it set down a marker of what The Pipettes were all about.
After six months Julia departed to concentrate on her other band, The Indelicates, but not before a small UK tour. A set in Cardiff captured the attention of a young blonde Welsh singer, and when the vacancy arose, she put her name forward. Gwenno Saunders joined the band and the trio were complete.

They are The Pipettes: Gwenno! Riot Becki! Rosay!

L-R: Rosay, Gwenno, Riot Becki
With the new line-up, things clicked into place. The band signed to an indie label and shortly after released their debut album, We Are The Pipettes. After a couple of subversive singles (Dirty Mind, about a boy with a dirty mind, and Judy, an anthem for that awkward idolisation of a cooler, confident, older girl, where you’re not quite sure if you want to be her or sleep with her), the band hit “big” with the song everyone knows them for - Pull Shapes.
Pull Shapes may not have been invented as the band’s signature song but it certainly ended up that way. It encapsulates everything great about music - how you can shut yourself away from your troubles, and dance the pain away, just give yourself in to three minutes of joyous pop and everything will seem that little bit sunnier afterwards. It distills the best bits of the ‘60s girl group genre: It’s superficially simple yet elaborate, catchy, and most importantly, so much fun. Pull Shapes was definitely the centrepiece of the album, but the rest of the songs supported it - previous singles ABC, Dirty Mind and Judy cropped up (but not School Uniform - it left the group with Julia and they’ve not performed or released it since) alongside kiss-off anthem Your Kisses Are Wasted On Me and a tracklisting that gets hornier as it goes along!
The album was something that was new and contemporary, while managing to harken back to the sound’s origins. It was fresh in an indie landscape dominated by nasally boys with guitars. It was remixed for the US release by Greg Wells (Greatest Showman, Waking Up In Vegas) which is worth checking out if you’re curious - it’s not necessarily better or worse than the original, but different enough to be interesting. It just failed to make the top 40 in the UK which was pretty good by indie standards - despite promo for the album including a stint on BBC news for the south east of England! The band toured extensively worldwide promoting the album.
It was around this time that cracks started to appear. As Gwenno put it:
“Rose, Becki and I constantly had discussions on whether it was empowering or not,” she says. “We were deadly serious about being flippant, and I found that to be empowering, but it also became very restricting in the end, because you were always playing a role. For all three of us as the front women, it was very difficult to subvert such a traditional gendered role, for as much as we tried. You couldn’t really be yourself when you were wearing the polka-dot dress.”
While the harmonised vocals complemented each other, when it came to solos Gwenno was very much the Rachel Berry of the group, and Mutya & Keisha Rose and Becki left to pursue other careers outside the band. All three original members were no longer part of the band, and I feel like further Sugababes jokes are just going to be superfluous.

Gwenno, Ani, and A. N. Other

Fortunately they were quickly replaced! Ani Saunders (Gwenno’s younger sister) and Anna McDonald joined the band for some support slots and promotional appearances. Anna left after six months, making her the ideal answer to “name a member of The Pipettes” in a round of Pointless. She was replaced by Beth Mburu-Bowie, who lasted a bit longer (eight months!), recorded the second album and promptly had all of her vocals expunged from the album after she left. The album Earth vs The Pipettes was rerecorded as a duo and released in 2010, about a decade too early for the 80s throwback sound it incorporated. Borrowing heavily from Stock-Aitken-Waterman era Kylie, the record managed to sound dated at the time, but now sounds contemporary in the current 80s revival! It’s still not a Pipettes album, or particularly good, but was slightly ahead of its time in how far behind the times it was.
After the album flopped the band split - which must make Christmas dinners awkward in the Saunders household - and went their separate ways musically. Julia has released six albums with The Indelicates, continuing her tongue-in-cheek voice without stifling her political side. Rosay has gone synthier and darker, releasing a couple of albums under her full name Rose Elinor Dougall and earning some extra dosh as a member of Mark Ronson’s touring band. Gwenno had a brief dalliance with electro-pop (ZIP download) before aiming squarely for mainstream success by releasing critically acclaimed albums in Welsh and Cornish languages. Rebecca Stephens (nee Riot!Becki) had a few solo acts before starting a family and campaigning for Labour. Beth Mburu-Bowie has worked with Friendly Fires and Metronomy on various songs. Ani Saunders has also released a Welsh language album and is doing her PhD. Anna McDonald appears to be a vocalist for hire.
The Pipettes. A possibly sexist 60s throwback group in a world that didn’t realise it needed one, they made their mark in the hearts of a small but appreciative group of fans. Even now the album’s great as a pick-me-up, a listen usually sparked by seeing polka-dots!

Other tracks you might like:

- Simon Says - 60s pop with a BDSM twist
- Really That Bad - “he’s such a bad boy, if only someone could tame him!”
- Feminist Complaints - a throwback to the Spector-esque wall of sound, erm, sound, but with added C-words (naughty language, don’t listen if you’re not allowed)
- Guess Who Ran Off With The Milkman? - a great subversion of the era they’re throwing back to, rather than pining for a boy to be forever betrothed to, our heroine decides she’d quite like to meet other boys and girls instead.

Discussion questions

Hello! This is the first, but hopefully not the last, of a series of pieces about British pop-adjacent things that maybe didn’t break out, but that I think Popheads might be interested in. Let’s face it, we’re all bored under lockdown, so I may as well put my time to good use! I have a couple of ideas lined up for the next few columns but I’m open to suggestions from others. This will continue as long as I'm not bored, out of ideas or dead.
submitted by jamesthegill to popheads [link] [comments]

